Dec 04 15:19:51 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 04 15:19:51 crc restorecon[4675]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 04 15:19:51 crc restorecon[4675]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 
15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 04 15:19:51 crc 
restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 04 15:19:51 crc restorecon[4675]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:51 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 
15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 
15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc 
restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 04 15:19:52 crc restorecon[4675]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
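Before the kubelet's own startup output continues below, a note on the restorecon pass that fills the log above: files under /var/lib/kubelet carry per-pod, MCS-labeled contexts such as system_u:object_r:container_file_t:s0:c7,c13, and because container_file_t is a customizable type in the loaded policy, restorecon reports each file as "not reset as customized by admin" and leaves the context in place instead of rewriting it to the policy default. A minimal shell sketch of that behavior follows; the path is taken from the log, the commands are standard SELinux tooling, and the -F variant is included only to explain the "customized by admin" wording, since forcing a reset on a live node would strip the per-pod MCS categories that container isolation and volume access depend on.

# Show the context the container runtime assigned (illustrative path from the log):
ls -Z /var/lib/kubelet/config.json

# Show the default label the file-context database would assign instead:
matchpathcon /var/lib/kubelet/config.json

# A plain recursive restore skips customizable/admin-set contexts and logs
# "not reset as customized by admin" for each one, exactly as in the journal:
restorecon -Rv /var/lib/kubelet

# -F forces the reset anyway; shown purely to illustrate the mechanism.
# Do NOT run this on a live node: it strips categories like s0:c7,c13.
restorecon -RFv /var/lib/kubelet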
Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 04 15:19:53 crc kubenswrapper[4676]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.090896 4676 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094940 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094966 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094971 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094975 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094978 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094982 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094987 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094992 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
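The "Flag ... has been deprecated" warnings above all point at the same migration: settings the kubelet used to take on the command line should now live in the KubeletConfiguration file passed via --config. Below is a hedged sketch of the config-file equivalents; the field names (containerRuntimeEndpoint, volumePluginDir, registerWithTaints, systemReserved, evictionHard) are real KubeletConfiguration fields, but every value, the file path, and the taint key are illustrative assumptions, not what this CRC node actually runs. --pod-infra-container-image is the exception: per the server.go line above, the sandbox image is now owned by the CRI runtime (pause_image in crio.conf on a CRI-O node), so it has no config-file field.

# Illustrative KubeletConfiguration covering the deprecated flags above;
# only the field names are real, all values are assumptions.
cat <<'EOF' > /etc/kubernetes/kubelet-conf-example.yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"  # was --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
registerWithTaints:                                           # was --register-with-taints
- key: node-role.kubernetes.io/master
  effect: NoSchedule
systemReserved:                                               # was --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                  # replaces --minimum-container-ttl-duration, per the warning
  memory.available: "100Mi"
EOF

kubelet --config /etc/kubernetes/kubelet-conf-example.yaml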
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.094997 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095002 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095007 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095037 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095045 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095050 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095055 4676 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095059 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095064 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095067 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095071 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095075 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095078 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095082 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095085 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095089 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095093 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095096 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095101 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095105 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095110 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095114 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095118 4676 feature_gate.go:330] unrecognized feature gate: Example Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095123 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095128 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095132 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 04 15:19:53 crc 
kubenswrapper[4676]: W1204 15:19:53.095136 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095141 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095145 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095149 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095154 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095158 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095164 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095169 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095175 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095181 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095187 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095191 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095195 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095201 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095205 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095209 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095213 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095217 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095223 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095229 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095234 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095239 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095244 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095249 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095254 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095258 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095264 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095270 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095277 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095282 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095286 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095291 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095295 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095299 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095304 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095309 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.095313 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095888 4676 flags.go:64] FLAG: --address="0.0.0.0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095920 4676 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095932 4676 flags.go:64] FLAG: --anonymous-auth="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095940 4676 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095948 4676 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095953 4676 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095961 4676 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095968 4676 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 04 15:19:53 crc 
kubenswrapper[4676]: I1204 15:19:53.095974 4676 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095979 4676 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095985 4676 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095990 4676 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.095997 4676 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096002 4676 flags.go:64] FLAG: --cgroup-root="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096007 4676 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096012 4676 flags.go:64] FLAG: --client-ca-file="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096017 4676 flags.go:64] FLAG: --cloud-config="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096022 4676 flags.go:64] FLAG: --cloud-provider="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096027 4676 flags.go:64] FLAG: --cluster-dns="[]" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096033 4676 flags.go:64] FLAG: --cluster-domain="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096038 4676 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096043 4676 flags.go:64] FLAG: --config-dir="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096048 4676 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096055 4676 flags.go:64] FLAG: --container-log-max-files="5" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096062 4676 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096067 4676 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096072 4676 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096078 4676 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096084 4676 flags.go:64] FLAG: --contention-profiling="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096089 4676 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096094 4676 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096099 4676 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096105 4676 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096111 4676 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096117 4676 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096123 4676 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096130 4676 flags.go:64] FLAG: --enable-load-reader="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096136 4676 flags.go:64] FLAG: 
--enable-server="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096141 4676 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096149 4676 flags.go:64] FLAG: --event-burst="100" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096154 4676 flags.go:64] FLAG: --event-qps="50" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096159 4676 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096164 4676 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096170 4676 flags.go:64] FLAG: --eviction-hard="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096177 4676 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096182 4676 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096187 4676 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096193 4676 flags.go:64] FLAG: --eviction-soft="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096199 4676 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096205 4676 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096211 4676 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096216 4676 flags.go:64] FLAG: --experimental-mounter-path="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096222 4676 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096227 4676 flags.go:64] FLAG: --fail-swap-on="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096233 4676 flags.go:64] FLAG: --feature-gates="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096240 4676 flags.go:64] FLAG: --file-check-frequency="20s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096245 4676 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096251 4676 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096256 4676 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096262 4676 flags.go:64] FLAG: --healthz-port="10248" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096268 4676 flags.go:64] FLAG: --help="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096273 4676 flags.go:64] FLAG: --hostname-override="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096278 4676 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096284 4676 flags.go:64] FLAG: --http-check-frequency="20s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096289 4676 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096295 4676 flags.go:64] FLAG: --image-credential-provider-config="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096300 4676 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096306 4676 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 04 
15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096312 4676 flags.go:64] FLAG: --image-service-endpoint="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096317 4676 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096323 4676 flags.go:64] FLAG: --kube-api-burst="100" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096328 4676 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096333 4676 flags.go:64] FLAG: --kube-api-qps="50" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096339 4676 flags.go:64] FLAG: --kube-reserved="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096344 4676 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096349 4676 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096354 4676 flags.go:64] FLAG: --kubelet-cgroups="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096359 4676 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096364 4676 flags.go:64] FLAG: --lock-file="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096369 4676 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096374 4676 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096380 4676 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096388 4676 flags.go:64] FLAG: --log-json-split-stream="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096393 4676 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096400 4676 flags.go:64] FLAG: --log-text-split-stream="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096406 4676 flags.go:64] FLAG: --logging-format="text" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096411 4676 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096417 4676 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096422 4676 flags.go:64] FLAG: --manifest-url="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096428 4676 flags.go:64] FLAG: --manifest-url-header="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096435 4676 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096440 4676 flags.go:64] FLAG: --max-open-files="1000000" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096447 4676 flags.go:64] FLAG: --max-pods="110" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096452 4676 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096457 4676 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096462 4676 flags.go:64] FLAG: --memory-manager-policy="None" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096467 4676 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096473 4676 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 04 
15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096478 4676 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096485 4676 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096499 4676 flags.go:64] FLAG: --node-status-max-images="50" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096504 4676 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096527 4676 flags.go:64] FLAG: --oom-score-adj="-999" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096533 4676 flags.go:64] FLAG: --pod-cidr="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096538 4676 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096545 4676 flags.go:64] FLAG: --pod-manifest-path="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096551 4676 flags.go:64] FLAG: --pod-max-pids="-1" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096557 4676 flags.go:64] FLAG: --pods-per-core="0" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096562 4676 flags.go:64] FLAG: --port="10250" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096567 4676 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096572 4676 flags.go:64] FLAG: --provider-id="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096577 4676 flags.go:64] FLAG: --qos-reserved="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096582 4676 flags.go:64] FLAG: --read-only-port="10255" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096587 4676 flags.go:64] FLAG: --register-node="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096592 4676 flags.go:64] FLAG: --register-schedulable="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096598 4676 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096606 4676 flags.go:64] FLAG: --registry-burst="10" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096612 4676 flags.go:64] FLAG: --registry-qps="5" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096616 4676 flags.go:64] FLAG: --reserved-cpus="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096621 4676 flags.go:64] FLAG: --reserved-memory="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096627 4676 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096631 4676 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096636 4676 flags.go:64] FLAG: --rotate-certificates="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096640 4676 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096644 4676 flags.go:64] FLAG: --runonce="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096648 4676 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096652 4676 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 04 15:19:53 crc 
kubenswrapper[4676]: I1204 15:19:53.096657 4676 flags.go:64] FLAG: --seccomp-default="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096661 4676 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096665 4676 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096670 4676 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096675 4676 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096680 4676 flags.go:64] FLAG: --storage-driver-password="root" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096684 4676 flags.go:64] FLAG: --storage-driver-secure="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096688 4676 flags.go:64] FLAG: --storage-driver-table="stats" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096693 4676 flags.go:64] FLAG: --storage-driver-user="root" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096697 4676 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096702 4676 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096706 4676 flags.go:64] FLAG: --system-cgroups="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096711 4676 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096718 4676 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096722 4676 flags.go:64] FLAG: --tls-cert-file="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096726 4676 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096732 4676 flags.go:64] FLAG: --tls-min-version="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096736 4676 flags.go:64] FLAG: --tls-private-key-file="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096740 4676 flags.go:64] FLAG: --topology-manager-policy="none" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096745 4676 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096750 4676 flags.go:64] FLAG: --topology-manager-scope="container" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096756 4676 flags.go:64] FLAG: --v="2" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096763 4676 flags.go:64] FLAG: --version="false" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096769 4676 flags.go:64] FLAG: --vmodule="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096776 4676 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.096781 4676 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096922 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
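Note: the flags.go:64 dump above records every kubelet flag with its effective value, one entry per flag. A minimal sketch for collecting that dump into a dict, assuming journalctl-style lines as in this log (the regex and helper name are illustrative, not part of the kubelet):

    import re

    FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w-]+)="(.*)"')

    def parse_flag_dump(lines):
        """Collect FLAG: --name="value" entries into {name: value}."""
        flags = {}
        for line in lines:
            m = FLAG_RE.search(line.strip())
            if m:
                flags[m.group(1)] = m.group(2)
        return flags

    # e.g. parse_flag_dump(open("kubelet.log"))["--node-ip"] -> "192.168.126.11"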
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096931 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096936 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096943 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096949 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096954 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096959 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096964 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096968 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096973 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096977 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096981 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096990 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096994 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.096998 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097001 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097005 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097009 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097012 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097016 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097021 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097025 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097029 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097034 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097038 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097043 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097047 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097052 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097057 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097061 4676 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097066 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097070 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097075 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097079 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097084 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097088 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097093 4676 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097098 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097102 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097110 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097116 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097121 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097126 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097131 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097139 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097143 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097149 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097155 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097160 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097165 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097170 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097174 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097179 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097184 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097191 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097196 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097201 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097205 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097210 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097214 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097219 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097223 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097227 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097231 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097235 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097239 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097242 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097246 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097249 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097254 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.097258 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.097284 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.110124 4676 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.110171 4676 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110272 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110284 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110288 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110293 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110297 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110302 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110309 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110313 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110316 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110321 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110325 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110328 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110332 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110336 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
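Note: the feature_gate.go:386 summary above is Go's fmt rendering of a map. A small, illustrative parser for that shape (parse_feature_gates is a made-up helper, not a kubelet API):

    import re

    def parse_feature_gates(entry):
        """Turn 'feature gates: {map[Name:true ...]}' into {Name: bool}."""
        body = re.search(r'feature gates: \{map\[(.*)\]\}', entry)
        if not body:
            return {}
        return {
            name: value == "true"
            for name, value in (pair.split(":") for pair in body.group(1).split())
        }

    # parse_feature_gates(line)["ValidatingAdmissionPolicy"] -> True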
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110341 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110345 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110351 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110355 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110358 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110362 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110367 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110371 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110374 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110378 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110381 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110385 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110389 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110392 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110396 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110399 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110402 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110407 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110411 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110416 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110420 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110423 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110427 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110430 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110434 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110438 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110443 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110447 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110452 4676 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110456 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110459 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110463 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110467 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110471 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110474 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110478 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110481 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110486 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110490 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110494 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110497 4676 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110501 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110504 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110508 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110511 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110515 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110518 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110522 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110526 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110529 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110533 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110536 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110540 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110543 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110547 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110551 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110554 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.110561 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110678 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110686 4676 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110690 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110694 4676 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110698 4676 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110702 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110706 4676 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110710 4676 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110713 4676 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110717 4676 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110720 4676 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110724 4676 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110728 4676 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110732 4676 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110735 4676 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110739 4676 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110742 4676 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110746 4676 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110749 4676 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110753 4676 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110757 4676 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110760 4676 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110764 4676 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110769 4676 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110773 4676 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110778 4676 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110782 4676 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110787 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110792 4676 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110796 4676 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110800 4676 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110805 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110809 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110813 4676 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110816 4676 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110820 4676 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110825 4676 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110831 4676 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110836 4676 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110840 4676 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110845 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110849 4676 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110852 4676 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110856 4676 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110859 4676 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110863 4676 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110867 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110870 4676 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110874 4676 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110877 4676 feature_gate.go:330] unrecognized feature gate: Example
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110881 4676 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110884 4676 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110889 4676 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110894 4676 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
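Note: the same unrecognized-gate warnings repeat in several passes above, apparently because the configured gate list is re-applied to more than one feature-gate consumer during startup; each pass ends in the identical feature_gate.go:386 summary, so nothing new is being reported. A sketch for collapsing the noise to one count per gate (assumes lines shaped like the ones in this log):

    from collections import Counter

    def summarize_unrecognized(lines):
        """Count how often each unrecognized feature gate is warned about."""
        marker = "unrecognized feature gate: "
        gates = Counter(
            line.split(marker, 1)[1].strip()
            for line in lines
            if marker in line
        )
        for gate, count in sorted(gates.items()):
            print(f"{gate}: warned {count}x")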
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110918 4676 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110925 4676 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110935 4676 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110941 4676 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110949 4676 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110957 4676 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110963 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110967 4676 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110972 4676 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110978 4676 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110981 4676 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110985 4676 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110988 4676 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110993 4676 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.110997 4676 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.111000 4676 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.111004 4676 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.111012 4676 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.111434 4676 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.114648 4676 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.114785 4676 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
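Note: a quick cross-check of the certificate-rotation entries that follow: the logged wait should equal the rotation deadline minus the log timestamp. A minimal sketch with the values copied from the entries below (timestamps rounded to whole seconds):

    from datetime import datetime, timezone

    now      = datetime(2025, 12, 4, 15, 19, 53, tzinfo=timezone.utc)  # log time
    deadline = datetime(2026, 1, 9, 14, 8, 38, tzinfo=timezone.utc)    # rotation deadline
    wait = deadline - now
    print(wait.days * 24 + wait.seconds // 3600, "hours")  # -> 862, matching 862h48m...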
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.115523 4676 server.go:997] "Starting client certificate rotation" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.115565 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.115849 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-09 14:08:38.951637896 +0000 UTC Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.116154 4676 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 862h48m45.835489967s for next certificate rotation Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.126892 4676 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.129172 4676 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.142024 4676 log.go:25] "Validated CRI v1 runtime API" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.168628 4676 log.go:25] "Validated CRI v1 image API" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.170420 4676 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.174133 4676 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-04-15-14-50-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.174226 4676 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.193935 4676 manager.go:217] Machine: {Timestamp:2025-12-04 15:19:53.190352268 +0000 UTC m=+0.625022155 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7171a43d-58aa-4be8-82e2-5e1d4cb4902b BootID:4574455b-7b00-4e77-9815-81145b03a6ca Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 
DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a3:e3:c0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a3:e3:c0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:32:50:fb Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2a:59:42 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:47:a8:33 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:dc:ee:0a Speed:-1 Mtu:1496} {Name:eth10 MacAddress:2e:06:63:f3:ac:34 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:e3:6d:5d:15:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] 
SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.194208 4676 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.194518 4676 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.194992 4676 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.195222 4676 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.195268 4676 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197219 4676 topology_manager.go:138] "Creating topology manager with none policy" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197242 4676 container_manager_linux.go:303] "Creating device plugin manager" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197439 4676 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197467 4676 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197683 4676 state_mem.go:36] "Initialized new in-memory state store" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.197796 4676 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.201113 4676 kubelet.go:418] "Attempting to sync node with API server" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.201135 4676 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.201175 4676 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.201195 4676 kubelet.go:324] "Adding apiserver pod source" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.201219 4676 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.212969 4676 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.213890 4676 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.215762 4676 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217427 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.217440 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217486 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217530 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.217526 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217543 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217562 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217574 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217588 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.217518 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: 
connect: connection refused Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217609 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217628 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217643 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.217652 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217670 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.217731 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.218690 4676 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.219255 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.219561 4676 server.go:1280] "Started kubelet" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.221618 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.221390 4676 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e0c44b5940896 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 15:19:53.219500182 +0000 UTC m=+0.654170049,LastTimestamp:2025-12-04 15:19:53.219500182 +0000 UTC m=+0.654170049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.221709 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 11:14:20.882285932 +0000 UTC Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.221763 4676 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 187h54m27.660526774s for next certificate rotation Dec 04 15:19:53 crc systemd[1]: Started Kubernetes Kubelet. 
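[editor's note] Two things are worth noting at this point in the boot. First, the repeated "dial tcp 38.102.83.158:6443: connect: connection refused" errors are expected on this single-node cluster: the kubelet starts before the static-pod kube-apiserver it is trying to watch, and the client-go reflectors and lease controller simply keep retrying until the endpoint comes up. Second, the certificate_manager entries show how rotation is scheduled: a rotation deadline is picked well before the certificate's expiry, and the manager sleeps until then; the "Waiting 187h54m27.660526774s" figure is just the gap between the current time and that deadline. A sketch of the arithmetic, under the assumption (from upstream certificate_manager.go behavior, not stated in this log) that the deadline is jittered into roughly the 70-90% band of the certificate's validity window:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline sketches why the deadline lands well before notAfter:
// pick a random point at 70-90% of the certificate's validity. The band is
// an assumption about upstream certificate_manager.go, not a log value.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	validity := notAfter.Sub(notBefore)
	jitter := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(jitter * float64(validity)))
}

func main() {
	// Values from the kubelet-serving entries above (timestamps are UTC).
	now := time.Date(2025, 12, 4, 15, 19, 53, 0, time.UTC)
	deadline := time.Date(2025, 12, 12, 11, 14, 20, 0, time.UTC)

	// The wait the kubelet logs is simply deadline minus now:
	fmt.Println(deadline.Sub(now)) // 187h54m27s, matching "Waiting 187h54m27.660526774s"

	_ = rotationDeadline // needs notBefore, which this log does not include
}
```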
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.222163 4676 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.222626 4676 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.222195 4676 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.222860 4676 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.222979 4676 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.223152 4676 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.223166 4676 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.223276 4676 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.223974 4676 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224004 4676 factory.go:55] Registering systemd factory
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224016 4676 factory.go:221] Registration of the systemd container factory successfully
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.224005 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms"
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.224223 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.224269 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224559 4676 factory.go:153] Registering CRI-O factory
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224578 4676 factory.go:221] Registration of the crio container factory successfully
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224643 4676 factory.go:103] Registering Raw factory
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.224704 4676 manager.go:1196] Started watching for new ooms in manager
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.225572 4676 manager.go:319] Starting recovery of all containers
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.225817 4676 server.go:460] "Adding debug handlers to kubelet server"
Dec 04 15:19:53 crc
kubenswrapper[4676]: E1204 15:19:53.324055 4676 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342704 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342858 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342884 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342928 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342948 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.342975 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343046 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343134 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343163 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343209 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343231 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343250 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343293 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343317 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343332 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343379 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343398 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343442 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343463 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.343479 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.344735 4676 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 
15:19:53.344845 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.344939 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345006 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345064 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345122 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345181 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345250 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345322 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345422 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345508 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.345791 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346208 4676 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346243 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346261 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346276 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346291 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346307 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346319 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346338 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346352 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346366 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346389 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346401 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346414 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346426 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346435 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346446 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346480 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346492 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346502 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346512 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346523 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346547 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346565 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346585 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346600 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346614 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346628 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346639 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346651 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346663 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346673 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346684 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346695 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346706 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346719 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346731 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346742 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346753 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346764 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346776 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346787 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346798 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346809 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346820 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346832 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346844 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346854 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346866 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346876 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346886 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346897 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346924 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346934 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346946 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346959 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346970 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346981 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.346992 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347003 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347013 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347025 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347036 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347046 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347056 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347068 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347081 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347095 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347108 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347121 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347133 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347146 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347160 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347173 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347189 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347200 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347211 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347222 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347234 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347245 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347255 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347266 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347276 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347287 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347298 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347309 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347325 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347339 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347355 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347366 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347378 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347392 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347406 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347419 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347432 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347441 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347452 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347461 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347471 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347480 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347490 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347501 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347511 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347522 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347536 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347548 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347562 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347573 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347585 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347596 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347606 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347616 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347626 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347635 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347644 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347653 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347663 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347673 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347682 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347691 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347701 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347710 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347719 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347729 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347737 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347746 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347755 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347765 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347774 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347827 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347863 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347889 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347927 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347944 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347957 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347970 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347983 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.347995 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348005 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348015 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348025 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348035 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348044 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348053 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348066 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348075 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348085 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348095 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348104 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348114 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348125 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348135 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348146 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348156 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348166 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348176 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348187 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348197 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348208 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348218 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348228 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348238 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348248 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348260 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348271 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348281 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348291 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348301 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348318 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348328 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348338 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348348 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348358 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348367 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348377 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348387 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348398 4676 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348415 4676 reconstruct.go:97] "Volume reconstruction finished" Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.348427 4676 reconciler.go:26] "Reconciler: start to sync state" 
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.357602 4676 manager.go:324] Recovery completed
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.368106 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.373400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.373482 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.373501 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.374361 4676 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.374376 4676 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.374562 4676 state_mem.go:36] "Initialized new in-memory state store"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.380959 4676 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.382872 4676 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.382978 4676 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.383039 4676 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.383104 4676 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 04 15:19:53 crc kubenswrapper[4676]: W1204 15:19:53.386139 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.386240 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.386287 4676 policy_none.go:49] "None policy: Start"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.387620 4676 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.388028 4676 state_mem.go:35] "Initializing new in-memory state store"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.426875 4676 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.428005 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.483574 4676 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.502540 4676 manager.go:334] "Starting Device Plugin manager"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.502624 4676 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.502641 4676 server.go:79] "Starting device plugin registration server"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.503270 4676 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.503319 4676 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.503722 4676 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.503846 4676 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.503862 4676 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.512220 4676 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.604061 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.618245 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.618321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.618338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.618390 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.619204 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.684486 4676 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.684769 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.686316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.686348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.686357 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.686534 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.687169 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.687218 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.687990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688025 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688175 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688551 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688604 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688936 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.688967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.689643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.689669 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.689680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.689786 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.690168 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.690201 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.690498 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.690522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.690532 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691001 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691135 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691523 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691561 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.691995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.692562 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.692582 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.692593 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.692728 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.692754 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693114 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.693575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760282 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760370 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760467 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760514 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760555 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760602 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760654 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760702 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760732 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760791 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760818 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760846 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760897 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.760968 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.819957 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.821555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.821640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.821661 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.821704 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.822452 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: E1204 15:19:53.829447 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862506 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862527 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862549 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862569 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862591 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862611 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862634 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862657 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862679 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862698 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862719 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862784 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862795 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862830 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862760 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862868 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862719 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862832 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862751 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862787 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862762 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863024 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.862868 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863049 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863042 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863128 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863191 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:53 crc kubenswrapper[4676]: I1204 15:19:53.863318 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.022066 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.040937 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.049443 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.067470 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7a7672418bf717dcb57335b9f5ccbb6b4d910c660e6e82999163c565a80c62e1 WatchSource:0}: Error finding container 7a7672418bf717dcb57335b9f5ccbb6b4d910c660e6e82999163c565a80c62e1: Status 404 returned error can't find the container with id 7a7672418bf717dcb57335b9f5ccbb6b4d910c660e6e82999163c565a80c62e1
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.068722 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.069267 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2043f3095df630a2fadd0cd1f6ffea21c75cb43346774c6de5b080478f0a6445 WatchSource:0}: Error finding container 2043f3095df630a2fadd0cd1f6ffea21c75cb43346774c6de5b080478f0a6445: Status 404 returned error can't find the container with id 2043f3095df630a2fadd0cd1f6ffea21c75cb43346774c6de5b080478f0a6445
Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.073001 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-6fe24cfcce5ce2ead3271b89a62860c4656d890c2d91a50a73c642752ec6788d WatchSource:0}: Error finding container 6fe24cfcce5ce2ead3271b89a62860c4656d890c2d91a50a73c642752ec6788d: Status 404 returned error can't find the container with id 6fe24cfcce5ce2ead3271b89a62860c4656d890c2d91a50a73c642752ec6788d
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.073856 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.081187 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-ccab69ae2b219508526027819ed9a2ec1de5d184cd4b85f8d1d1e1d0e94ab2f4 WatchSource:0}: Error finding container ccab69ae2b219508526027819ed9a2ec1de5d184cd4b85f8d1d1e1d0e94ab2f4: Status 404 returned error can't find the container with id ccab69ae2b219508526027819ed9a2ec1de5d184cd4b85f8d1d1e1d0e94ab2f4
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.220468 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.222559 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.224442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.224476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.224488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.224515 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.224869 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc"
Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.281443 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp
38.102.83.158:6443: connect: connection refused Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.281579 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.318585 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.318690 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.386876 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6fe24cfcce5ce2ead3271b89a62860c4656d890c2d91a50a73c642752ec6788d"} Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.388289 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2043f3095df630a2fadd0cd1f6ffea21c75cb43346774c6de5b080478f0a6445"} Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.389283 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7a7672418bf717dcb57335b9f5ccbb6b4d910c660e6e82999163c565a80c62e1"} Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.390342 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3ec4f17ebbb0ae62c58b07461d628f98290c45257f0abc1b8a8ff098583f83f2"} Dec 04 15:19:54 crc kubenswrapper[4676]: I1204 15:19:54.391275 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ccab69ae2b219508526027819ed9a2ec1de5d184cd4b85f8d1d1e1d0e94ab2f4"} Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.630426 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.634591 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.634708 4676 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:54 crc kubenswrapper[4676]: W1204 15:19:54.718700 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:54 crc kubenswrapper[4676]: E1204 15:19:54.718827 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.026017 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.028301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.028381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.028439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.028501 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 15:19:55 crc kubenswrapper[4676]: E1204 15:19:55.029378 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.220430 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.395861 4676 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9" exitCode=0 Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.395954 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.396009 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.397011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.397046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc 
kubenswrapper[4676]: I1204 15:19:55.397059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.399803 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.399847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.399860 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401088 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e" exitCode=0 Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401220 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401321 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401937 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.401972 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.403198 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c8b191d6338793c17ccd1b13a3f0da5b52edf268ac4e62ed44a23bc13987f498" exitCode=0 Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.403243 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c8b191d6338793c17ccd1b13a3f0da5b52edf268ac4e62ed44a23bc13987f498"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.403325 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.403808 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404369 4676 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404379 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.404725 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.406183 4676 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71" exitCode=0 Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.406210 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71"} Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.406264 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.420307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.420344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:55 crc kubenswrapper[4676]: I1204 15:19:55.420355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:56 crc kubenswrapper[4676]: E1204 15:19:56.282705 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.282805 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:56 crc kubenswrapper[4676]: W1204 15:19:56.449566 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:56 crc kubenswrapper[4676]: E1204 15:19:56.449710 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.457027 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.457094 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.490110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.490358 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.491308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.491332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.491343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.492695 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.494240 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.497377 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0aa339f2bf96f8fdf32781adeb6fed4e4fb34bd3c7f954da5b8fcaeb737ba622" exitCode=0 Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.497410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0aa339f2bf96f8fdf32781adeb6fed4e4fb34bd3c7f954da5b8fcaeb737ba622"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.497701 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.500798 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.500836 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.500848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.506293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf579ed5bf7237ca102c3239090f593aa508f224de04b9c0b080aff84cc8afe5"} Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.506596 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.509070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.509120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.509130 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:56 crc kubenswrapper[4676]: W1204 15:19:56.610055 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:56 crc kubenswrapper[4676]: E1204 15:19:56.610171 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.701597 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.723870 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.723944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.723956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:56 crc kubenswrapper[4676]: I1204 15:19:56.723994 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 15:19:56 crc kubenswrapper[4676]: E1204 15:19:56.733011 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.158:6443: connect: connection refused" node="crc" Dec 04 15:19:56 crc kubenswrapper[4676]: E1204 15:19:56.964725 4676 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187e0c44b5940896 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 15:19:53.219500182 +0000 UTC m=+0.654170049,LastTimestamp:2025-12-04 15:19:53.219500182 +0000 UTC m=+0.654170049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 
15:19:57.220623 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.516036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d"} Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.516083 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.519008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.519066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.519084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.523721 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e"} Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.523791 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c"} Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.527121 4676 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e4597026983ab596353c0171c413634e779b2a5a6c0a47355a52f1d34d510414" exitCode=0 Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.527292 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.527281 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e4597026983ab596353c0171c413634e779b2a5a6c0a47355a52f1d34d510414"} Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.527348 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.527292 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529637 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529703 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:57 crc kubenswrapper[4676]: I1204 15:19:57.529883 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:57 crc kubenswrapper[4676]: W1204 15:19:57.616636 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:57 crc kubenswrapper[4676]: E1204 15:19:57.616783 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:57 crc kubenswrapper[4676]: W1204 15:19:57.621213 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:57 crc kubenswrapper[4676]: E1204 15:19:57.621368 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.158:6443: connect: connection refused" logger="UnhandledError" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.221264 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.158:6443: connect: connection refused Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.223316 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.229329 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.540031 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597"} Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.540197 4676 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.541994 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.542058 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.542078 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.544539 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4d760bf5c5edfb0f89ca23f9584b46772f56a46c99e2e3eec42fa3552f6a3a8a"} Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.544566 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.544579 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1e26ea373751878911cddde1e998fbcf4aa2d973b19212b3e90b1685fd62faa6"} Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.544678 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.544739 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.545981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.546042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.546056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.549631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.549669 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:58 crc kubenswrapper[4676]: I1204 15:19:58.549684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.552358 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.552407 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.552430 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.552431 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.552567 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.553875 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f3fe8bd56d1361529fe625e816fd98470819fc427d15c54cc3fe676dae9dbabb"} Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.553946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"26dbcc545b2a51577cdc49f063efc56788a8022f0a41a8bffd9cce46263f3997"} Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.553960 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d667e3b15a9f25dae9977786e50d5b684598c396e45c0cabd3b4f590e435b4c"} Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554114 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554122 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.554164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.555350 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.555381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.555391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.934155 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.935360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.935402 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.935417 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:19:59 crc kubenswrapper[4676]: I1204 15:19:59.935446 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 15:20:00 crc kubenswrapper[4676]: I1204 15:20:00.555279 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:00 crc kubenswrapper[4676]: I1204 15:20:00.556628 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:00 crc kubenswrapper[4676]: I1204 15:20:00.556674 
4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:00 crc kubenswrapper[4676]: I1204 15:20:00.556688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.012525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.012691 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.012736 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.013978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.014078 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.014161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.267254 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.558064 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.558119 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.558826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.558863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.558876 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.932615 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.932789 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.932837 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.934361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.934402 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:01 crc kubenswrapper[4676]: I1204 15:20:01.934415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.212036 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:02 crc 
kubenswrapper[4676]: I1204 15:20:02.220406 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.390449 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.390658 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.391822 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.391854 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.391863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.560560 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.560744 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.562232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.562291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.562307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.566005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.566077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.566099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.630628 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.630899 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.632760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.632926 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:02 crc kubenswrapper[4676]: I1204 15:20:02.632947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.195421 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.381148 4676 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.381672 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.382874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.382988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.383050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:03 crc kubenswrapper[4676]: E1204 15:20:03.512490 4676 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.563556 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.565149 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.565190 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:03 crc kubenswrapper[4676]: I1204 15:20:03.565204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:06 crc kubenswrapper[4676]: I1204 15:20:06.196282 4676 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 15:20:06 crc kubenswrapper[4676]: I1204 15:20:06.196444 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.296208 4676 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 04 15:20:09 crc kubenswrapper[4676]: E1204 15:20:09.533391 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.923251 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.928735 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597"} Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.928739 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597" exitCode=255 Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.929710 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.931708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.931776 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.931790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:09 crc kubenswrapper[4676]: I1204 15:20:09.934168 4676 scope.go:117] "RemoveContainer" containerID="3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597" Dec 04 15:20:09 crc kubenswrapper[4676]: E1204 15:20:09.939124 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 04 15:20:10 crc kubenswrapper[4676]: W1204 15:20:10.158562 4676 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.159623 4676 trace.go:236] Trace[837056711]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 15:20:00.157) (total time: 10001ms): Dec 04 15:20:10 crc kubenswrapper[4676]: Trace[837056711]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:20:10.158) Dec 04 15:20:10 crc kubenswrapper[4676]: Trace[837056711]: [10.001714994s] [10.001714994s] END Dec 04 15:20:10 crc kubenswrapper[4676]: E1204 15:20:10.159966 4676 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.936829 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.940344 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58"} Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.940869 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.942942 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.943002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:10 crc kubenswrapper[4676]: I1204 15:20:10.943017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:11 crc kubenswrapper[4676]: I1204 15:20:11.267808 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" start-of-body= Dec 04 15:20:11 crc kubenswrapper[4676]: I1204 15:20:11.268083 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded" Dec 04 15:20:11 crc kubenswrapper[4676]: I1204 15:20:11.329429 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 04 15:20:11 crc kubenswrapper[4676]: I1204 15:20:11.329537 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.221325 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.221779 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.226432 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.226652 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.228140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.228206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.228236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.228771 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.229100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:12 crc kubenswrapper[4676]: I1204 15:20:12.229312 4676 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.409259 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.409568 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.411174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.411232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.411249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.424807 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 04 15:20:13 crc kubenswrapper[4676]: E1204 15:20:13.513174 4676 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.952001 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.953087 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.953154 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:13 crc kubenswrapper[4676]: I1204 15:20:13.953170 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.196994 4676 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.197125 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.275450 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.275696 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.277347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.277391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 
15:20:16.277406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.280172 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.336224 4676 trace.go:236] Trace[495430704]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 15:20:02.568) (total time: 13768ms): Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[495430704]: ---"Objects listed" error: 13767ms (15:20:16.335) Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[495430704]: [13.768020122s] [13.768020122s] END Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.336630 4676 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.337621 4676 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.338432 4676 trace.go:236] Trace[495061020]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 15:20:02.280) (total time: 14057ms): Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[495061020]: ---"Objects listed" error: 14056ms (15:20:16.337) Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[495061020]: [14.057003581s] [14.057003581s] END Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.338494 4676 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.339450 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.343884 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.343978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.343998 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.344266 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.345033 4676 trace.go:236] Trace[1218161566]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (04-Dec-2025 15:20:01.879) (total time: 14465ms): Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[1218161566]: ---"Objects listed" error: 14465ms (15:20:16.344) Dec 04 15:20:16 crc kubenswrapper[4676]: Trace[1218161566]: [14.465845417s] [14.465845417s] END Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.345062 4676 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 15:20:16 crc kubenswrapper[4676]: E1204 15:20:16.352122 4676 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 04 15:20:16 crc kubenswrapper[4676]: I1204 15:20:16.588563 4676 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.301067 4676 apiserver.go:52] 
"Watching apiserver" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.306725 4676 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.307414 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308413 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308418 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.308770 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308565 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308432 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.309035 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308615 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.308557 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.309216 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.311583 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.311971 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312164 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312541 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312594 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312742 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312807 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.312925 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.313307 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.327493 4676 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346375 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346444 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346483 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346543 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346569 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346590 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346612 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346635 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346669 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346689 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346736 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346762 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346785 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346805 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346828 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346846 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346877 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346897 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347017 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347041 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347064 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347100 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347119 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 15:20:17 crc kubenswrapper[4676]: 
I1204 15:20:17.347137 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347159 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347181 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347260 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347281 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347298 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347319 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347349 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347381 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347406 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 15:20:17 crc 
kubenswrapper[4676]: I1204 15:20:17.347434 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347474 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347512 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347539 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347565 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347593 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347616 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347644 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347664 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347798 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 
15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347826 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347847 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347873 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347896 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347935 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347974 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347993 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348041 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348081 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348099 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348116 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348134 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348165 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348189 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348209 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346874 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348302 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348329 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346874 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348336 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348454 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348488 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348511 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348533 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348561 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348589 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348617 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348645 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348670 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348727 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348755 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348797 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348845 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348870 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348893 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348938 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348959 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348981 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349001 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349020 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349045 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349066 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349084 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349101 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349120 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349140 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349159 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349179 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349207 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349224 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349247 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349266 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349290 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349310 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349335 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349354 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349376 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349398 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349420 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349443 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349467 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349502 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349528 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349613 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349648 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349677 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349704 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349730 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349758 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349784 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349808 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349834 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349861 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349885 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350013 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350056 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350083 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350111 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350137 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350163 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350186 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350207 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350225 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350242 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350259 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350275 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350308 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350335 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350371 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350390 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350408 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350428 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350447 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350468 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350489 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350509 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350530 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350563 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350591 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350623 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350646 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350670 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350722 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350748 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350771 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350793 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351153 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351191 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351216 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351243 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351271 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351338 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351482 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351506 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351529 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351553 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351740 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351767 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351808 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351835 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351861 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351892 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351942 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351973 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352004 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352054 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352112 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352998 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353045 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353116 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353153 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353267 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353299 4676 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353331 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353447 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353476 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353501 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353521 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353543 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353561 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353586 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353617 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 04 15:20:17 crc 
kubenswrapper[4676]: I1204 15:20:17.353636 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353654 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353673 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353694 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353720 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353740 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353764 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354056 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354136 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: 
\"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354249 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354337 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354380 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354434 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354474 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354592 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354755 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354802 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354955 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354974 4676 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354987 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.346970 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347034 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347127 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347161 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347293 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347375 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347573 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.361700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). 
InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347840 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347965 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347977 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.347985 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348040 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348169 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348169 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348202 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). 
InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348218 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348554 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348606 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348601 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348627 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.348780 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349058 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349074 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349241 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349261 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349612 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349752 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.349841 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350178 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350654 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350674 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.350863 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351591 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.351825 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352086 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352250 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352339 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.352936 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353234 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353440 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.353788 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354157 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354439 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354731 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354773 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.354844 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.355108 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.355455 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.355773 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.355832 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.356417 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.356774 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.357006 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.357160 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.357185 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.357539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.358228 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.361044 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.360896 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.361352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.361377 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.361836 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362003 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362160 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362473 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362862 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362877 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.362995 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.363342 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.366012 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.366609 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.366637 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367186 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367191 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367217 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367537 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367665 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.367990 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368084 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368165 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368416 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368479 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368428 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368656 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.368925 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:09Z\\\",\\\"message\\\":\\\"W1204 15:19:58.223447 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 15:19:58.224253 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764861598 cert, and key in /tmp/serving-cert-2614641796/serving-signer.crt, /tmp/serving-cert-2614641796/serving-signer.key\\\\nI1204 15:19:58.658314 1 observer_polling.go:159] Starting file observer\\\\nW1204 15:19:58.661547 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 15:19:58.662089 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:19:58.688040 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2614641796/tls.crt::/tmp/serving-cert-2614641796/tls.key\\\\\\\"\\\\nF1204 15:20:09.299706 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.369429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.370026 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.370293 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.370699 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.370748 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.370820 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.371221 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.371411 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.371466 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.371613 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.372032 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.372106 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.372157 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.372210 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.372897 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.373224 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.373353 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.373356 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.373782 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.373876 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.374161 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.374571 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.374759 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.374768 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.374844 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.374967 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:17.87480816 +0000 UTC m=+25.309478017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375120 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375354 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375365 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375546 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375625 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.375685 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.376285 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.376669 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.376829 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.376884 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.377207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.377446 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.377609 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.377977 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.378101 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.378351 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.378735 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.378834 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.378700 4676 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.379049 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.379615 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.379871 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380028 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380149 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380198 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380409 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380467 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380591 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.380887 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381069 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381185 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381226 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381526 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381558 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381687 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.381833 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.382238 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.382303 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.382497 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.382548 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.383382 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:17.883356758 +0000 UTC m=+25.318026615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.383543 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.385725 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.386355 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.386744 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.387156 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.387156 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.387328 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.387696 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.388574 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.388858 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.410082 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.389280 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.389319 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.390084 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.402042 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410219 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410243 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.390953 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.396386 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.395014 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.404133 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.392084 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.392547 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.393669 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.394301 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.395148 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.396238 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.396295 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.399901 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:17.899862487 +0000 UTC m=+25.334532344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410713 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:17.910684491 +0000 UTC m=+25.345354348 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.399937 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.400156 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.401846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.402494 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.406250 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.407050 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.407346 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.408597 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410845 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410857 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.410892 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:17.910883597 +0000 UTC m=+25.345553664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.409077 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.412097 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.413475 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.414296 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.415371 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.415633 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.416804 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.417834 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.419234 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.420196 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.422456 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.424780 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.425794 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.427001 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.427636 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.428781 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.429349 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.428303 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.429618 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.430115 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.435420 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.437478 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.438144 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.438180 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.438219 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.444998 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.446746 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:09Z\\\",\\\"message\\\":\\\"W1204 15:19:58.223447 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 15:19:58.224253 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764861598 cert, and key in /tmp/serving-cert-2614641796/serving-signer.crt, /tmp/serving-cert-2614641796/serving-signer.key\\\\nI1204 15:19:58.658314 1 observer_polling.go:159] Starting file observer\\\\nW1204 15:19:58.661547 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 15:19:58.662089 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:19:58.688040 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2614641796/tls.crt::/tmp/serving-cert-2614641796/tls.key\\\\\\\"\\\\nF1204 15:20:09.299706 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.447414 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.451083 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.457944 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458599 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458718 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458828 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458852 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458869 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458884 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458896 4676 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458954 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458968 4676 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458981 4676 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.458993 4676 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459006 4676 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459019 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459040 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459054 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459067 4676 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459081 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459121 4676 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459137 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459151 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459169 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459182 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459195 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc 
kubenswrapper[4676]: I1204 15:20:17.459208 4676 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459221 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459235 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459247 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459260 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459272 4676 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459285 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459297 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459311 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459323 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459335 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459347 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459359 4676 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459371 4676 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459383 4676 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459399 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459414 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459425 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459437 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459450 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459464 4676 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459478 4676 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459490 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459502 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459513 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459528 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459541 4676 reconciler_common.go:293] "Volume detached 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459552 4676 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459566 4676 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459579 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459591 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459602 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459614 4676 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459757 4676 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459774 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459788 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459882 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459897 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459934 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459947 4676 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459959 4676 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459972 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.459988 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460000 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460012 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460023 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460036 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460048 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460060 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460095 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460109 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460121 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460134 4676 
reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460146 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460187 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460199 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460211 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460224 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460237 4676 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460251 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460263 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460277 4676 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460289 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460302 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460314 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460326 4676 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460338 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460349 4676 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460360 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460370 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460382 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460394 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460405 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460417 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460428 4676 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460440 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460452 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460463 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460476 4676 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460486 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460496 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460508 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460521 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460533 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460546 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460557 4676 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460572 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460585 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460599 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460612 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460624 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460638 4676 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460650 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460664 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.460853 4676 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461202 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461233 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461405 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461432 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461446 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461456 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461467 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461477 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461528 4676 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461550 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461562 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461576 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461590 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461603 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461616 4676 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461629 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461644 4676 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461658 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461672 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461684 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461697 4676 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461710 4676 reconciler_common.go:293] "Volume detached for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461722 4676 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461736 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461748 4676 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461760 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461773 4676 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461798 4676 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461812 4676 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461824 4676 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461837 4676 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461848 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461860 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461872 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461884 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461895 4676 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461927 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461942 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461954 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461967 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461981 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.461994 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462007 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462019 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462033 4676 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462048 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462061 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462092 4676 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462106 4676 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462119 4676 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462132 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462147 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462159 4676 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462172 4676 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462183 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462196 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462207 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462218 4676 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462233 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462245 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462258 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462270 4676 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462283 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462296 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462307 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462333 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462345 4676 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462357 4676 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462368 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462379 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462391 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462409 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462422 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462435 4676 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.462665 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.463003 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.463965 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.464858 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.466509 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.470718 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.472410 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.476564 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.478099 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.478351 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.479232 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.481055 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.482594 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.483147 4676 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.483276 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.485532 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 
15:20:17.487418 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.487954 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.490126 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.492277 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.493120 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.494117 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.495715 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.496656 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.498091 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.500171 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.501050 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.502573 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.503738 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.505655 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.506929 4676 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.508181 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.509072 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.509897 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.511082 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.511729 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.512948 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.526878 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.542284 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.568809 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.568844 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.568854 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.574714 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.591690 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.607228 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.641779 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.643216 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.653333 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 04 15:20:17 crc kubenswrapper[4676]: W1204 15:20:17.747393 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4b03be4d907e5e9c851431f9990d709f363d01fa44a7f1975bfb8c76ab22c9ff WatchSource:0}: Error finding container 4b03be4d907e5e9c851431f9990d709f363d01fa44a7f1975bfb8c76ab22c9ff: Status 404 returned error can't find the container with id 4b03be4d907e5e9c851431f9990d709f363d01fa44a7f1975bfb8c76ab22c9ff
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.875782 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.876030 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.876177 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:18.876151223 +0000 UTC m=+26.310821080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.976221 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.976549 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:18.976490345 +0000 UTC m=+26.411160222 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.976665 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.976778 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.976843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.976923 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.976961 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.976983 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977044 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977082 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:18.977050602 +0000 UTC m=+26.411720459 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977116 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:18.977102733 +0000 UTC m=+26.411772590 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977156 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977211 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977241 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.977310 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:18.977286669 +0000 UTC m=+26.411956526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.982825 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"54d65462d6b9fd423b829d631565e40d25b0b995f0117a7715d44238948ec260"}
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.983976 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"42152f8d83ec27a894b0e07e0ad50cf3db2d4bf0592badc8099ec1fcdccdefa2"}
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.987317 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.988037 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.990211 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58" exitCode=255
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.990316 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58"}
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.990542 4676 scope.go:117] "RemoveContainer" containerID="3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.991526 4676 scope.go:117] "RemoveContainer" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58"
Dec 04 15:20:17 crc kubenswrapper[4676]: E1204 15:20:17.991853 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Dec 04 15:20:17 crc kubenswrapper[4676]: I1204 15:20:17.994254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4b03be4d907e5e9c851431f9990d709f363d01fa44a7f1975bfb8c76ab22c9ff"}
Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.310831 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3cacaebe22ca69d4b637935b964e6c53b11f8aed3516435fcb05ba2acfdd4597\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:09Z\\\",\\\"message\\\":\\\"W1204 15:19:58.223447 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1204 15:19:58.224253 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764861598 cert, and key in /tmp/serving-cert-2614641796/serving-signer.crt, /tmp/serving-cert-2614641796/serving-signer.key\\\\nI1204 15:19:58.658314 1 observer_polling.go:159] Starting file observer\\\\nW1204 15:19:58.661547 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1204 15:19:58.662089 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:19:58.688040 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2614641796/tls.crt::/tmp/serving-cert-2614641796/tls.key\\\\\\\"\\\\nF1204 15:20:09.299706 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.356460 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.381619 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.396279 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.411289 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.433946 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.448551 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.978530 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.978656 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.978688 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.978712 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.978730 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978893 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978951 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978960 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979002 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:20.978976245 +0000 UTC m=+28.413646102 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978966 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979029 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:20.979017626 +0000 UTC m=+28.413687483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978921 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979045 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:20.979037897 +0000 UTC m=+28.413707754 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979063 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:20.979057598 +0000 UTC m=+28.413727455 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.978892 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979100 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979108 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:18 crc kubenswrapper[4676]: E1204 15:20:18.979135 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:20.97913001 +0000 UTC m=+28.413799857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 04 15:20:18 crc kubenswrapper[4676]: I1204 15:20:18.998917 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.001974 4676 scope.go:117] "RemoveContainer" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58"
Dec 04 15:20:19 crc kubenswrapper[4676]: E1204 15:20:19.002294 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.002852 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33"}
Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.004416 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4"}
Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.004445 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365"}
Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.051181 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.265444 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.317292 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.330540 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.351930 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.386849 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:19 crc kubenswrapper[4676]: E1204 15:20:19.387045 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.387495 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:19 crc kubenswrapper[4676]: E1204 15:20:19.387555 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.387607 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:19 crc kubenswrapper[4676]: E1204 15:20:19.387728 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.389812 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.390880 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.392224 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.393297 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.395083 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.395848 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.521032 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.538738 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.545743 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9bc4z"] Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.546362 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5s6p9"] Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.546830 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.547418 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.549652 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.549937 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.550183 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.550357 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.550713 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.550886 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.551116 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.553898 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.594638 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.617764 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.632746 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b3eca9b5-0269-40ad-8bc1-142e702d9454-rootfs\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.632804 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x6p4\" (UniqueName: \"kubernetes.io/projected/0eaaf25e-b575-426f-9967-d81ac3c882ee-kube-api-access-7x6p4\") pod \"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.632919 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3eca9b5-0269-40ad-8bc1-142e702d9454-proxy-tls\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.632965 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh2vc\" (UniqueName: \"kubernetes.io/projected/b3eca9b5-0269-40ad-8bc1-142e702d9454-kube-api-access-lh2vc\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.632989 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0eaaf25e-b575-426f-9967-d81ac3c882ee-hosts-file\") pod 
\"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.633032 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3eca9b5-0269-40ad-8bc1-142e702d9454-mcd-auth-proxy-config\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.641197 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.660864 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.680152 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.700760 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.726483 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:19Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.733950 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x6p4\" (UniqueName: \"kubernetes.io/projected/0eaaf25e-b575-426f-9967-d81ac3c882ee-kube-api-access-7x6p4\") pod \"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734288 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3eca9b5-0269-40ad-8bc1-142e702d9454-proxy-tls\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh2vc\" (UniqueName: 
\"kubernetes.io/projected/b3eca9b5-0269-40ad-8bc1-142e702d9454-kube-api-access-lh2vc\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0eaaf25e-b575-426f-9967-d81ac3c882ee-hosts-file\") pod \"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734695 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3eca9b5-0269-40ad-8bc1-142e702d9454-mcd-auth-proxy-config\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734806 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b3eca9b5-0269-40ad-8bc1-142e702d9454-rootfs\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734797 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0eaaf25e-b575-426f-9967-d81ac3c882ee-hosts-file\") pod \"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.734871 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b3eca9b5-0269-40ad-8bc1-142e702d9454-rootfs\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.735761 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b3eca9b5-0269-40ad-8bc1-142e702d9454-mcd-auth-proxy-config\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:19 crc kubenswrapper[4676]: I1204 15:20:19.742873 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b3eca9b5-0269-40ad-8bc1-142e702d9454-proxy-tls\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.274970 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh2vc\" (UniqueName: \"kubernetes.io/projected/b3eca9b5-0269-40ad-8bc1-142e702d9454-kube-api-access-lh2vc\") pod \"machine-config-daemon-5s6p9\" (UID: \"b3eca9b5-0269-40ad-8bc1-142e702d9454\") " pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.275045 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7x6p4\" (UniqueName: \"kubernetes.io/projected/0eaaf25e-b575-426f-9967-d81ac3c882ee-kube-api-access-7x6p4\") pod \"node-resolver-9bc4z\" (UID: \"0eaaf25e-b575-426f-9967-d81ac3c882ee\") " pod="openshift-dns/node-resolver-9bc4z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.289881 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.332215 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.344286 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-f8vjl"] Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.345255 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.350877 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-wch9m"]
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.351405 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.352377 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.352580 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.352743 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.353058 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.353960 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.354274 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.355717 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.371265 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wmbt2"]
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.372355 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.381562 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.381622 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.381650 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-os-release\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.381674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-kubelet\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.381880 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382023 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382094 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-os-release\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382123 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382153 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-bin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382259 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-cnibin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-etc-kubernetes\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382312 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-system-cni-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382360 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382381 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382527 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382589 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6vk\" (UniqueName: \"kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382669 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382705 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-system-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382722 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382765 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-multus-daemon-config\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382789 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382938 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.382960 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-multus-certs\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383101 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frh24\" (UniqueName: \"kubernetes.io/projected/3f9795f2-fd74-48a2-af9c-90e7d47ab178-kube-api-access-frh24\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383263 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-cni-binary-copy\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383429 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-socket-dir-parent\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-netns\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383788 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-hostroot\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383815 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383955 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.383973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384090 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-multus\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384135 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cnibin\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384256 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384291 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384348 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-k8s-cni-cncf-io\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384500 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxq8f\" (UniqueName: \"kubernetes.io/projected/2a201486-d4f3-4677-adad-4028d94e0623-kube-api-access-wxq8f\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384571 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-conf-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.384612 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.407882 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.408092 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.408243 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.408445 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.411565 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.411997 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.415544 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.416555 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.466213 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.479773 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485403 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-os-release\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485515 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485538 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-bin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485559 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-cnibin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485583 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-etc-kubernetes\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-system-cni-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485649 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485668 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6vk\" (UniqueName: \"kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485723 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485766 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-system-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485790 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-multus-daemon-config\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485835 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485860 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485883 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-multus-certs\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.485942 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frh24\" (UniqueName: \"kubernetes.io/projected/3f9795f2-fd74-48a2-af9c-90e7d47ab178-kube-api-access-frh24\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486001 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-cni-binary-copy\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486066 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-socket-dir-parent\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486167 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486189 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-netns\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486208 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-hostroot\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486229 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486251 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486271 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486293 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-multus\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486332 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cnibin\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486366 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486407 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-k8s-cni-cncf-io\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486446 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxq8f\" (UniqueName: \"kubernetes.io/projected/2a201486-d4f3-4677-adad-4028d94e0623-kube-api-access-wxq8f\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486467 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-conf-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486514 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486569 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486595 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-os-release\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486619 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-kubelet\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486641 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.486743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.487464 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.487801 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-os-release\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.488381 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-binary-copy\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.488559 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-socket-dir-parent\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.488590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-cni-binary-copy\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489421 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489463 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-bin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-cnibin\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489563 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-etc-kubernetes\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489604 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-system-cni-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489642 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489668 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489680 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489716 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-netns\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489720 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-hostroot\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.489790 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490085 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-tuning-conf-dir\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490513 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9bc4z"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490661 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490733 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-cni-multus\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490769 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cnibin\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490820 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490852 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-k8s-cni-cncf-io\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490891 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3f9795f2-fd74-48a2-af9c-90e7d47ab178-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490704 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.490981 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491039 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-system-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491082 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-run-multus-certs\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491096 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491235 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491273 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-host-var-lib-kubelet\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491244 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-conf-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491306 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491318 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491350 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-multus-cni-dir\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491359 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.491737 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a201486-d4f3-4677-adad-4028d94e0623-os-release\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.492130 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a201486-d4f3-4677-adad-4028d94e0623-multus-daemon-config\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.496352 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.573549 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxq8f\" (UniqueName: \"kubernetes.io/projected/2a201486-d4f3-4677-adad-4028d94e0623-kube-api-access-wxq8f\") pod \"multus-wch9m\" (UID: \"2a201486-d4f3-4677-adad-4028d94e0623\") " pod="openshift-multus/multus-wch9m"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.576963 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6vk\" (UniqueName: \"kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk\") pod \"ovnkube-node-wmbt2\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.587494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frh24\" (UniqueName: \"kubernetes.io/projected/3f9795f2-fd74-48a2-af9c-90e7d47ab178-kube-api-access-frh24\") pod \"multus-additional-cni-plugins-f8vjl\" (UID: \"3f9795f2-fd74-48a2-af9c-90e7d47ab178\") " pod="openshift-multus/multus-additional-cni-plugins-f8vjl"
Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.592592 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.624550 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.650959 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.676082 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.683976 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.692202 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wch9m" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.698098 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.707018 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.725560 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: W1204 15:20:20.733545 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ad0d70_0230_4055_a56e_d83c06c6e0b3.slice/crio-1aa9bf6672ad90ee6ed4581d5a45ad804e1c37d893bd8d72a0c5ef890f5738e2 WatchSource:0}: Error finding container 1aa9bf6672ad90ee6ed4581d5a45ad804e1c37d893bd8d72a0c5ef890f5738e2: Status 404 returned error can't find the container with id 1aa9bf6672ad90ee6ed4581d5a45ad804e1c37d893bd8d72a0c5ef890f5738e2 Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.746626 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.772184 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.792150 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.810975 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.904287 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.933781 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.956700 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.986971 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:20Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:20 crc kubenswrapper[4676]: 
I1204 15:20:20.993513 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.993950 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.994509 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.994581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:20 crc kubenswrapper[4676]: I1204 15:20:20.994623 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.994874 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995728 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995792 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995791 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995816 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995812 4676 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.996072 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.996090 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.995971 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:24.995943833 +0000 UTC m=+32.430613850 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.996165 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:24.996137968 +0000 UTC m=+32.430807885 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.996186 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:24.996177439 +0000 UTC m=+32.430847506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.996208 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:24.99619626 +0000 UTC m=+32.430866337 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:20 crc kubenswrapper[4676]: E1204 15:20:20.997039 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:24.996994313 +0000 UTC m=+32.431664350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.012960 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.032760 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.285551 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerStarted","Data":"67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.285654 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerStarted","Data":"f2c48772853c3dd509e2ed096ed2b16d8fea5c4ed1e68b0c67787d9bcf57ee96"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.288599 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerStarted","Data":"5ab779bd29e13be099d687ed0cd3c95b005019be83e99fad21323afcf40bd48c"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.290227 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bc4z" 
event={"ID":"0eaaf25e-b575-426f-9967-d81ac3c882ee","Type":"ContainerStarted","Data":"26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.290272 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9bc4z" event={"ID":"0eaaf25e-b575-426f-9967-d81ac3c882ee","Type":"ContainerStarted","Data":"fc160b62a381f0210d779d34392e861f445a284b9e93961a41954e70d4f6abb0"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.293428 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" exitCode=0 Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.293586 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.293637 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"1aa9bf6672ad90ee6ed4581d5a45ad804e1c37d893bd8d72a0c5ef890f5738e2"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.343455 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.343546 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.343564 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"a71999878bbdfffb9330e6e894c450af2e14b5de1054f84b03dc7239bfd25bc5"} Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.393978 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.394334 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.394416 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:21 crc kubenswrapper[4676]: E1204 15:20:21.394882 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:21 crc kubenswrapper[4676]: E1204 15:20:21.395185 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:21 crc kubenswrapper[4676]: E1204 15:20:21.395302 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.435627 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.524541 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.680715 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.707307 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.738291 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: 
I1204 15:20:21.777132 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.791244 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.806643 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.947517 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.970025 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:21 crc kubenswrapper[4676]: I1204 15:20:21.987289 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.001550 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.017336 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.033235 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.050944 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.067120 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 
15:20:22.083716 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.100107 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.123127 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.138197 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.151082 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.165418 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.184253 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a
1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.197372 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\"
,\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.348613 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.350023 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerStarted","Data":"975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69"} Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.369182 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.539549 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.593433 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.674603 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.873604 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:22 crc kubenswrapper[4676]: I1204 15:20:22.942191 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:22Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.021945 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.049432 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.175480 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.207404 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.225049 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.234788 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.352277 4676 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.383572 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.383662 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.383608 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:23 crc kubenswrapper[4676]: E1204 15:20:23.383792 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:23 crc kubenswrapper[4676]: E1204 15:20:23.383954 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:23 crc kubenswrapper[4676]: E1204 15:20:23.384116 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.445980 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.446041 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.446053 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.446181 4676 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.516305 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.527235 4676 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.527859 4676 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.530002 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.537098 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.537469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.537548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.537629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.537776 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:23Z","lastTransitionTime":"2025-12-04T15:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.664406 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.808459 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dgffs"] Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.809133 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.872674 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.872999 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.873127 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.879648 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eba809fc-7400-4863-8e96-baae38c42001-serviceca\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.879734 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba809fc-7400-4863-8e96-baae38c42001-host\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.879936 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmr2q\" (UniqueName: \"kubernetes.io/projected/eba809fc-7400-4863-8e96-baae38c42001-kube-api-access-rmr2q\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.886059 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 
04 15:20:23 crc kubenswrapper[4676]: E1204 15:20:23.906290 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:23 crc kubenswrapper[4676]: I1204 15:20:23.960530 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.008332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.008408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.008421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.008449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.008465 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.023100 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmr2q\" (UniqueName: \"kubernetes.io/projected/eba809fc-7400-4863-8e96-baae38c42001-kube-api-access-rmr2q\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.023223 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eba809fc-7400-4863-8e96-baae38c42001-serviceca\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.023255 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba809fc-7400-4863-8e96-baae38c42001-host\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.023400 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba809fc-7400-4863-8e96-baae38c42001-host\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.024823 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/eba809fc-7400-4863-8e96-baae38c42001-serviceca\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: E1204 15:20:24.128811 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"cru
n\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.162486 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.163140 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmr2q\" (UniqueName: \"kubernetes.io/projected/eba809fc-7400-4863-8e96-baae38c42001-kube-api-access-rmr2q\") pod \"node-ca-dgffs\" (UID: \"eba809fc-7400-4863-8e96-baae38c42001\") " pod="openshift-image-registry/node-ca-dgffs"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.166290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.166340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.166352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.166375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.166388 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.191059 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.193873 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dgffs" Dec 04 15:20:24 crc kubenswrapper[4676]: E1204 15:20:24.196370 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 
2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.203164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.203201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.203213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.203230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.203241 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.217329 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 
crc kubenswrapper[4676]: E1204 15:20:24.229989 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 
04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.269179 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.269829 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.269869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.269886 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.269983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.270015 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: E1204 15:20:24.295326 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 
2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: E1204 15:20:24.295516 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309782 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309851 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309774 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.309865 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.341950 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.359461 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.376232 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.388728 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.411481 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.413010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.413097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.413116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.413169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.413188 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.435864 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.458334 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.516293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.516531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.516552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.516579 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.516599 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.537351 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.542874 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.544404 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69" exitCode=0 Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.545134 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69"} Dec 04 15:20:24 crc kubenswrapper[4676]: W1204 15:20:24.599784 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba809fc_7400_4863_8e96_baae38c42001.slice/crio-575de7aa30f1e3c5d56b5a1201058060d43449139d4f104a4d1381daa3c3eb43 WatchSource:0}: Error finding container 575de7aa30f1e3c5d56b5a1201058060d43449139d4f104a4d1381daa3c3eb43: Status 404 returned error can't find the container with id 575de7aa30f1e3c5d56b5a1201058060d43449139d4f104a4d1381daa3c3eb43 Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.600674 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623388 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.623922 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.643067 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.778437 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.787992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.788040 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.788052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.788073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.788087 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.987578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.987622 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.987633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.987651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:24 crc kubenswrapper[4676]: I1204 15:20:24.987661 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:24Z","lastTransitionTime":"2025-12-04T15:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.006496 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc 
kubenswrapper[4676]: I1204 15:20:25.078336 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.078553 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.078593 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.078671 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.078691 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:33.078644744 +0000 UTC m=+40.513314741 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.078741 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:33.078728706 +0000 UTC m=+40.513398773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.078810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.078867 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079054 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079075 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079090 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079165 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:33.079153809 +0000 UTC m=+40.513823676 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079204 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079247 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079264 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079346 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:33.079324864 +0000 UTC m=+40.513994721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079345 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.079530 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:33.079492799 +0000 UTC m=+40.514162866 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.112023 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.112100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.112112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.112132 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.112143 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.151984 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.152831 4676 scope.go:117] "RemoveContainer" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.153031 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.248222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.248265 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.248277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.248295 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.248306 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.284273 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.351828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.352256 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.352341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.352421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.352525 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.367254 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.385523 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.385676 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.386044 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.386092 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.386140 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:25 crc kubenswrapper[4676]: E1204 15:20:25.386183 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.455713 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.455761 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.455772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.455791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.455802 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.522189 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.551259 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.552010 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dgffs" event={"ID":"eba809fc-7400-4863-8e96-baae38c42001","Type":"ContainerStarted","Data":"575de7aa30f1e3c5d56b5a1201058060d43449139d4f104a4d1381daa3c3eb43"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.558105 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.558142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.558152 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.558171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.558182 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.637743 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.661121 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.661165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.661179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.661198 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.661211 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.764722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.764808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.764834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.764958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.764987 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.864740 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.867781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.867834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.867852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.867879 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.867893 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.881938 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.904663 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.925615 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.941898 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.965748 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.971019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.971076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.971089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.971109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:25 crc kubenswrapper[4676]: I1204 15:20:25.971123 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:25Z","lastTransitionTime":"2025-12-04T15:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.101920 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.101986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.101997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.102027 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.102041 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.205662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.206187 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.206270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.206368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.206481 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
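Every "Failed to update status for pod" record above shares one root cause: the kubelet's POST to the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 fails TLS verification because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-04. The error text comes from Go's crypto/x509 validity-window check. A minimal sketch of that same check, assuming the serving certificate has been exported to a local PEM file (the filename below is a placeholder, not a path the cluster actually uses):

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Placeholder path: assume the webhook's serving cert was exported here.
	raw, err := os.ReadFile("webhook-serving-cert.pem")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(raw)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block in PEM input")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// This is the condition behind the journal's "certificate has expired
		// or is not yet valid: current time ... is after ..." message.
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Printf("valid until %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
	}
}
```

Until that one serving certificate is rotated, every pod status patch will keep failing with the identical x509 error; the repeats below reflect a single cluster-wide problem, not per-pod ones.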
Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.211730 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.238120 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.250618 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.269288 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.287351 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/e
tc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310177 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310610 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.310663 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.324573 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.338586 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.353096 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.385290 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.413613 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.413666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.413682 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.413705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.413717 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.420142 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.433490 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.447101 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.469292 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:26Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.519052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.519639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.519794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.519898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.520061 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.622991 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.623046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.623060 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.623080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.623096 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.726369 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.726419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.726453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.726475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.726486 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.830687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.830733 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.830760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.830778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.830788 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.961930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.962026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.962045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.962077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:26 crc kubenswrapper[4676]: I1204 15:20:26.962125 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:26Z","lastTransitionTime":"2025-12-04T15:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.065660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.065727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.065747 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.065783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.065796 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.205332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.205382 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.205391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.205414 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.205436 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.308929 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.309335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.309346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.309363 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.309374 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.386351 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.386457 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:27 crc kubenswrapper[4676]: E1204 15:20:27.386647 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:27 crc kubenswrapper[4676]: E1204 15:20:27.386811 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.386850 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:27 crc kubenswrapper[4676]: E1204 15:20:27.387008 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.417821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.417866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.417880 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.417915 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.417926 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.523275 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.523675 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.523763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.523846 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.523950 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.565929 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.566402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.568473 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerStarted","Data":"25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.571597 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dgffs" event={"ID":"eba809fc-7400-4863-8e96-baae38c42001","Type":"ContainerStarted","Data":"0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.583308 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.599071 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\
\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.621274 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\
\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.635392 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.646559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.646618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.646635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.646662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.646676 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.652249 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.681821 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.698113 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.723077 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.788652 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.788712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.788736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.788771 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.788796 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.802825 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.825432 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.846472 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.868728 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.892545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.892879 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.892981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.893084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.893173 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.900564 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04e
d6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.996303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.997513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.997605 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.997681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.997742 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:27Z","lastTransitionTime":"2025-12-04T15:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:27 crc kubenswrapper[4676]: I1204 15:20:27.996658 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:27Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.014525 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.112677 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.128217 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.149692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.150145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.150238 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.150309 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.150434 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.149834 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.179174 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.203389 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.226059 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.244873 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.261867 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.278853 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-
dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.292180 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.306449 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.325787 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.344624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.344672 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.344685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.344705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.344715 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.347592 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.449237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.449759 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.449841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.449959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.450063 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.553033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.553089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.553099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.553119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.553134 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.577734 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.579661 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01" exitCode=0 Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.580101 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.599727 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.625123 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.770837 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.773823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.773863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.773873 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc 
kubenswrapper[4676]: I1204 15:20:28.773891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.773922 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.790522 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.807136 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.835770 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.853452 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.869201 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.876718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.876765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.876778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.876798 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.876810 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.884650 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.904615 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\
\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"nam
e\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.923110 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.943751 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.960768 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.975408 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.980659 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.980930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.981017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.981096 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.981195 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:28Z","lastTransitionTime":"2025-12-04T15:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:28 crc kubenswrapper[4676]: I1204 15:20:28.991997 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:28Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.007706 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.113456 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.115606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.115659 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.115674 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.115696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.115715 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.128188 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.140349 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.152333 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.167615 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.178627 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.204389 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.219584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.219728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.219816 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.219572 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.220204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.220236 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.233742 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.247230 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.265380 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.283373 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\
\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.323891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.323970 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.323983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.324004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.324016 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.384568 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.384666 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.385154 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:29 crc kubenswrapper[4676]: E1204 15:20:29.385346 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:29 crc kubenswrapper[4676]: E1204 15:20:29.385523 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:29 crc kubenswrapper[4676]: E1204 15:20:29.385656 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.427930 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.427972 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.427984 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.428004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.428015 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.531205 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.531260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.531269 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.531289 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.531302 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.587056 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4" exitCode=0 Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.587161 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.593037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.606281 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.625952 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.634575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.634617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.634629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.634648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.634659 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.641586 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.667535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.686084 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.702113 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.725529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z 
is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.738243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.738290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.738302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.738321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.738334 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.740088 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 
15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.752958 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.772846 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2
bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.789327 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"m
ountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.803983 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.822291 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.837703 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:29Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.840962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.841119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.841212 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.841302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.841385 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.945033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.945073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.945082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.945101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:29 crc kubenswrapper[4676]: I1204 15:20:29.945113 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:29Z","lastTransitionTime":"2025-12-04T15:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.082947 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.083031 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.083044 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.083065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.083078 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
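
The repeating "Node became not ready" entries above all report the same Ready=False condition: the container runtime network is not ready because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/. That is consistent with the status dumps earlier in this section, where the ovnkube-node and multus-additional-cni-plugins pods, which are responsible for producing that configuration, are still in PodInitializing. A minimal check of the directory the kubelet names, assuming it runs on the affected node, might look like the sketch below; the suffix filter follows the usual CNI convention (.conf/.conflist/.json).

    # Sketch only; the path comes verbatim from the kubelet message above.
    import os

    cni_dir = "/etc/kubernetes/cni/net.d/"
    try:
        confs = [f for f in os.listdir(cni_dir)
                 if f.endswith((".conf", ".conflist", ".json"))]
    except FileNotFoundError:
        confs = []
    print(f"{len(confs)} CNI config file(s) in {cni_dir}: {confs}")

Once the ovnkube-controller container starts, it would be expected to write its config file there, after which the NetworkReady condition should flip back to true.
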
Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.187142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.187183 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.187193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.187210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.187222 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.290310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.290367 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.290476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.290514 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.290539 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.392943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.392993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.393003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.393020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.393030 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.495854 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.495891 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.495901 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.495938 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.495951 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.598571 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.598614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.598624 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.598640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.598651 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
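
The same five-entry group (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, followed by the setters.go Ready=False condition) recurs roughly every 100 ms through this stretch. When triaging a capture like this, collapsing the repetition into counts makes the distinct messages easier to see; a small sketch, with kubelet.log standing in for wherever this excerpt was saved:

    # Tally repeated kubelet node events; "kubelet.log" is a stand-in filename.
    import re
    from collections import Counter

    event_re = re.compile(r'event="(?P<event>[^"]+)"')
    counts = Counter()
    with open("kubelet.log", encoding="utf-8") as fh:
        for line in fh:
            if "Recording event message for node" in line:
                m = event_re.search(line)
                if m:
                    counts[m.group("event")] += 1
    for event, n in counts.most_common():
        print(f"{n:5d}  {event}")
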
Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.600319 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0" exitCode=0 Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.600357 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.621322 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.640429 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.654940 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.688663 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.704167 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.705014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.705074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.705089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.705109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.705121 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.718407 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.732156 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.748631 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.765475 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.783190 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.800621 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.809539 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.809592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.809604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.809621 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.809632 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.813626 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.828647 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.843650 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:30Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.912250 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.912288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.912298 4676 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.912318 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:30 crc kubenswrapper[4676]: I1204 15:20:30.912329 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:30Z","lastTransitionTime":"2025-12-04T15:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.014755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.014804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.014814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.014832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.014843 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.118096 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.118162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.118173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.118193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.118207 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.221134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.221196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.221210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.221232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.221246 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.325477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.325525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.325543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.325570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.325583 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.384092 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.384274 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:31 crc kubenswrapper[4676]: E1204 15:20:31.384477 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.384540 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:31 crc kubenswrapper[4676]: E1204 15:20:31.384573 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:31 crc kubenswrapper[4676]: E1204 15:20:31.384758 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.428876 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.428948 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.428959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.428979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.428990 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.532367 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.532403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.532411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.532428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.532438 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.615547 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.616138 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.623492 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415" exitCode=0 Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.623547 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.636838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.636925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.636943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.636969 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.636982 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.639155 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.703971 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.705548 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.720084 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.738719 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.739562 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.739588 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.739597 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.739614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.739624 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.757135 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.773229 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.788498 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.802340 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.823672 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.845034 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.845119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.845137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.845166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.845179 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.849375 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.878170 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.904625 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.930367 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.965027 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.965069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.965089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.965118 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.965149 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:31Z","lastTransitionTime":"2025-12-04T15:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:31 crc kubenswrapper[4676]: I1204 15:20:31.980611 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.000744 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:31Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.019866 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.039246 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.061505 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.068679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.068744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.068757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.068784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.068797 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.092475 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.128731 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.153481 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.172439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.172519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.172543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.172591 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.172609 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.198842 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.223711 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.243265 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.265046 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.275455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.275527 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.275547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.275581 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.275601 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.290501 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.311742 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.335278 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.378522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.378584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.378600 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.378629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.378648 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.482077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.482128 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.482140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.482159 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.482178 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.584671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.584714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.584727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.584747 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.584760 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.631110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerStarted","Data":"99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.631235 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.632446 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.656942 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.660117 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.677251 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.688685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.688734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.688753 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.688778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.688794 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.692395 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.709711 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.725833 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.741458 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.754662 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.768675 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.786751 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.792138 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.792199 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.792214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.792237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.792249 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.806640 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.824631 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.839205 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.855835 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.873585 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.891268 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.895201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.895239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.895252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.895271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.895285 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.906621 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.921940 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.941335 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.959667 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.982764 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999176 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999262 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999301 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:32Z","lastTransitionTime":"2025-12-04T15:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:32 crc kubenswrapper[4676]: I1204 15:20:32.999332 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:32Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.012868 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.029098 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.046234 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.063646 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.078980 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.093952 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.102548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.102636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.102651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.102674 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.102693 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.109245 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.109417 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.109457 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.109486 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.109536 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109627 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:20:49.109571053 +0000 UTC m=+56.544240950 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109705 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109721 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109835 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:49.10981045 +0000 UTC m=+56.544480497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109740 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109838 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109735 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109937 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109953 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.110015 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:49.109976644 +0000 UTC m=+56.544646711 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.109875 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.110043 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:49.110031596 +0000 UTC m=+56.544701683 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.110066 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:49.110057817 +0000 UTC m=+56.544727904 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.126579 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.205328 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.205379 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 
15:20:33.205391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.205413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.205426 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.309075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.309128 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.309140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.309158 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.309168 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.384239 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.384334 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.384239 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.384467 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.384615 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:33 crc kubenswrapper[4676]: E1204 15:20:33.384721 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.403351 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.412173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.412225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.412238 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.412259 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.412274 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.422648 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.438374 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.457539 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.474576 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.488861 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 
15:20:33.504082 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.514163 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.514199 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.514210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.514228 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.514239 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.517419 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.528504 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.543427 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/e
tc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"ru
n-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.559172 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.573337 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.586734 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.604455 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.617043 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.617571 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.617601 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.617630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.617658 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.621538 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.641075 4676 generic.go:334] "Generic (PLEG): container finished" podID="3f9795f2-fd74-48a2-af9c-90e7d47ab178" containerID="99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680" exitCode=0 Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.641165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerDied","Data":"99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.658081 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.673351 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.690162 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.711173 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.721106 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.721155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.721166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc 
kubenswrapper[4676]: I1204 15:20:33.721185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.721202 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.725066 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.737981 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.751434 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.769200 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c50
78e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.785997 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.801167 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.817571 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.824769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.824821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.824834 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.824859 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.824872 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.830794 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.845059 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.859854 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:33Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.928188 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.928234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.928246 4676 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.928266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:33 crc kubenswrapper[4676]: I1204 15:20:33.928278 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:33Z","lastTransitionTime":"2025-12-04T15:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.031705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.032056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.032153 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.032309 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.032420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.136128 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.136172 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.136184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.136203 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.136217 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.238714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.238758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.238801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.238819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.238833 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.342186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.342247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.342261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.342280 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.342290 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.412952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.412999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.413009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.413027 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.413037 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.430025 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.436130 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.436227 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.436243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.436271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.436289 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.517559 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.522643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.522705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
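The condition driving the NodeNotReady events above is always the same: the kubelet sees no CNI configuration in /etc/kubernetes/cni/net.d/, a directory the network plugin (OVN-Kubernetes on CRC) only populates once it is running. A minimal sketch of that presence check, assuming the standard CNI file extensions (*.conf, *.conflist, *.json); the directory path is taken from the log message, everything else is illustrative:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Path quoted in the NetworkReady=false message above.
	confDir := "/etc/kubernetes/cni/net.d"
	var found []string
	// Extensions the CNI config loaders conventionally accept (assumption).
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pat))
		if err != nil {
			fmt.Fprintln(os.Stderr, "glob:", err)
			os.Exit(1)
		}
		found = append(found, matches...)
	}
	if len(found) == 0 {
		// Matches the log's situation: the network plugin has not written its config yet.
		fmt.Println("no CNI configuration files found in", confDir)
		return
	}
	for _, f := range found {
		fmt.Println("found CNI config:", f)
	}
}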
event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.522725 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.522748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.522764 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.540323 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.548316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.548632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.548706 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.548770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.548950 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.564020 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.568696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.568868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.569014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.569126 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.569213 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.582756 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: E1204 15:20:34.583046 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.585281 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
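Every one of the failed attempts above dies at the same point: the node-identity webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, while the node clock reads 2025-12-04, consistent with a CRC VM resumed long after its bundled certificates were minted. A minimal sketch, assuming the endpoint from the log, for reading that certificate's validity window from the node itself; InsecureSkipVerify is deliberate so the handshake completes even though verification would fail:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
)

func main() {
	// Address taken from the webhook error in the log above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // we only want to read the cert, not trust it
	})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// First peer certificate is the serving (leaf) certificate.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore)
	fmt.Println("notAfter: ", cert.NotAfter) // per the log, expect 2025-08-24T17:21:41Z
}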
event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.585307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.585316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.585333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.585345 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.657192 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" event={"ID":"3f9795f2-fd74-48a2-af9c-90e7d47ab178","Type":"ContainerStarted","Data":"7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.682530 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
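
The kube-apiserver-check-endpoints container above died with exit code 255 because it could not find its own pod object ("pods \"kube-apiserver-crc\" not found"), and the kubelet now holds it in CrashLoopBackOff. The "back-off 10s" in the waiting message is the first step of the kubelet's restart back-off, which doubles after each failed restart up to a five-minute cap; the constants in this sketch mirror the kubelet defaults, but the code is illustrative, not kubelet source.

package main

import (
	"fmt"
	"time"
)

func main() {
	backoff := 10 * time.Second        // initial delay, as in "back-off 10s"
	const maxBackoff = 5 * time.Minute // cap, after which "back-off 5m0s" repeats

	for restart := 1; restart <= 6; restart++ {
		fmt.Printf("restart %d: wait %s\n", restart, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}

The delay resets only after a container has run cleanly for a sustained period (ten minutes by default), which cannot happen here while the pod object itself is missing.
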
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.688199 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.688481 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.688559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.688633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.688747 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.701511 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.724172 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.740200 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.754709 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.770196 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 
15:20:34.784670 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.797370 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.806395 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.806440 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.806452 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.806469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.806489 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.818963 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.832386 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.847817 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.866563 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.893336 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.910404 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.910447 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:34 crc 
kubenswrapper[4676]: I1204 15:20:34.910457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.910475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.910486 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:34Z","lastTransitionTime":"2025-12-04T15:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:34 crc kubenswrapper[4676]: I1204 15:20:34.911493 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:34Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.013370 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.013809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.013824 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.013844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.013859 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.157765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.157827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.157838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.157862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.157880 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.260692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.260727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.260735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.260751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.260760 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.288843 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"]
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.289622 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.292372 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.292809 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.299657 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.299792 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2362781-61ed-4bed-b752-d89d5808d9fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.299826 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgtzt\" (UniqueName: \"kubernetes.io/projected/c2362781-61ed-4bed-b752-d89d5808d9fe-kube-api-access-jgtzt\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.299857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.305612 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.320281 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.334491 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-da
emon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.356424 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.374282 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.377382 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.377681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.377785 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.377893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.378068 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.384025 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 15:20:35 crc kubenswrapper[4676]: E1204 15:20:35.384168 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.384526 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.384565 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 15:20:35 crc kubenswrapper[4676]: E1204 15:20:35.384616 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 15:20:35 crc kubenswrapper[4676]: E1204 15:20:35.384667 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.395030 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.400484 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.400570 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2362781-61ed-4bed-b752-d89d5808d9fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.400610 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgtzt\" (UniqueName: \"kubernetes.io/projected/c2362781-61ed-4bed-b752-d89d5808d9fe-kube-api-access-jgtzt\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.400646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.401529 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-env-overrides\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.401980 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2362781-61ed-4bed-b752-d89d5808d9fe-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.407456 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2362781-61ed-4bed-b752-d89d5808d9fe-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd"
Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.410319 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.419660 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgtzt\" (UniqueName: \"kubernetes.io/projected/c2362781-61ed-4bed-b752-d89d5808d9fe-kube-api-access-jgtzt\") pod \"ovnkube-control-plane-749d76644c-wldgd\" (UID: \"c2362781-61ed-4bed-b752-d89d5808d9fe\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.428044 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.444377 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.462811 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.478282 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.480753 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.480828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.480844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.480872 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.480890 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.491375 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.507463 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.521410 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.535752 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:35Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.584569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:35 crc 
kubenswrapper[4676]: I1204 15:20:35.584626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.584639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.584662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.584678 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.604766 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.662570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" event={"ID":"c2362781-61ed-4bed-b752-d89d5808d9fe","Type":"ContainerStarted","Data":"1855eaf75326a8c896cff0e4ee67abdb679c900cca502a56c120f52e6f7e3705"} Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.912401 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.912572 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.912592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.912642 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:35 crc kubenswrapper[4676]: I1204 15:20:35.912662 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:35Z","lastTransitionTime":"2025-12-04T15:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.015577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.015626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.015640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.015659 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.015671 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.119649 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.119694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.119705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.119725 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.119745 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.224524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.224580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.224589 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.224609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.224619 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.371653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.371718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.371732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.371781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.371799 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.387301 4676 scope.go:117] "RemoveContainer" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.399577 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-nsvkq"] Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.400612 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: E1204 15:20:36.400688 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.417312 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.433540 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.445627 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.466316 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.470637 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.470689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xw6r\" (UniqueName: \"kubernetes.io/projected/711742b9-8c03-4234-ae1d-4d7d3baa4217-kube-api-access-6xw6r\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.475508 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.475540 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.475550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.475571 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.475586 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.481789 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.496247 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.508489 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.528747 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.546083 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.567151 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.571129 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.571234 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xw6r\" (UniqueName: \"kubernetes.io/projected/711742b9-8c03-4234-ae1d-4d7d3baa4217-kube-api-access-6xw6r\") pod 
\"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: E1204 15:20:36.571727 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:36 crc kubenswrapper[4676]: E1204 15:20:36.571842 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:37.071812493 +0000 UTC m=+44.506482350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.578466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.578853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.578864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.578885 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.578926 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.584329 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.592600 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xw6r\" (UniqueName: \"kubernetes.io/projected/711742b9-8c03-4234-ae1d-4d7d3baa4217-kube-api-access-6xw6r\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.597540 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.611377 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.624426 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.636140 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.649355 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.670139 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/0.log" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.674293 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327" exitCode=1 Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.674361 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.675350 4676 scope.go:117] "RemoveContainer" containerID="e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.676949 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" event={"ID":"c2362781-61ed-4bed-b752-d89d5808d9fe","Type":"ContainerStarted","Data":"56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.677020 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" 
event={"ID":"c2362781-61ed-4bed-b752-d89d5808d9fe","Type":"ContainerStarted","Data":"a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.681640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.681683 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.681693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.681714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.681728 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.693706 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.712445 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.726478 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.743297 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.769149 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.784704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.784765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.784779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.784806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.784821 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.792881 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.806000 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.817376 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.830417 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.843545 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.856967 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.870159 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.884871 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.887187 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.887225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.887237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.887254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.887265 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.898430 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.911142 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.928430 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.946413 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.961714 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.978270 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.990765 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.991022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.991148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.991340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.991471 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:36Z","lastTransitionTime":"2025-12-04T15:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:36 crc kubenswrapper[4676]: I1204 15:20:36.999630 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:36Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.020397 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.035068 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.050510 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.068016 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.074636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " 
pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:37 crc kubenswrapper[4676]: E1204 15:20:37.074828 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:37 crc kubenswrapper[4676]: E1204 15:20:37.074922 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:38.074888637 +0000 UTC m=+45.509558494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.082716 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":
\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.094693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.094993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.095112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.095190 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.095252 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.100582 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.122426 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.138245 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.164865 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.187174 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.199761 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.199830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.199840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.199881 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.199895 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.204490 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.220760 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc 
kubenswrapper[4676]: I1204 15:20:37.303004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.303058 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.303075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.303095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.303109 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.384520 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.384565 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:37 crc kubenswrapper[4676]: E1204 15:20:37.384724 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.384773 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:37 crc kubenswrapper[4676]: E1204 15:20:37.384943 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:37 crc kubenswrapper[4676]: E1204 15:20:37.385052 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.615335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.615392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.615405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.615438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.615453 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.684525 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.686719 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.687566 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.689897 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/0.log" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.694134 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.695206 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.703603 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.726929 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.726984 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.726999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.727023 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.727036 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.740212 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d0
83d329f04301a87482140327\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.766734 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.810124 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.829756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.829792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.829801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.829816 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.829826 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.834090 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"co
ntainerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.933182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.933230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.933242 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.933260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.933272 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:37Z","lastTransitionTime":"2025-12-04T15:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:37 crc kubenswrapper[4676]: I1204 15:20:37.985772 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:37Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.007521 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.020387 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.035696 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.036175 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.036403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.036435 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.036466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.036484 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.048889 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.077440 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.094523 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.188444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:38 crc kubenswrapper[4676]: E1204 15:20:38.188641 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:38 crc kubenswrapper[4676]: E1204 15:20:38.188741 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:40.188699137 +0000 UTC m=+47.623368994 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.191206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.191414 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.191521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.191617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.191706 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.220484 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j
gtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.233984 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.249971 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.266402 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.281784 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.294293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.294346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.294358 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc 
kubenswrapper[4676]: I1204 15:20:38.294379 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.294392 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.296986 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.310548 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.335144 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20c
c3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.350278 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.364132 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.374516 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.384065 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:38 crc kubenswrapper[4676]: E1204 15:20:38.384257 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.389576 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2505
5ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.397138 4676 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.397179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.397189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.397208 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.397221 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.403879 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.418603 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.431656 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.540783 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.541569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.541607 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.541619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.541637 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.541647 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.559625 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.575267 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.589969 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.601690 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:38Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.644037 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.644076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.644095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.644120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.644131 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.748116 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.748181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.748196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.748215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.748226 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.854101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.854151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.854165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.854185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.854200 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.957228 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.957284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.957297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.957325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:38 crc kubenswrapper[4676]: I1204 15:20:38.957345 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:38Z","lastTransitionTime":"2025-12-04T15:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.059615 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.059748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.059784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.059803 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.059814 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.162744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.162787 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.162797 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.162813 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.162824 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.266189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.266264 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.266273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.266290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.266299 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.370656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.370745 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.370770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.370810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.370838 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.384179 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.384278 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:39 crc kubenswrapper[4676]: E1204 15:20:39.384481 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.384506 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:39 crc kubenswrapper[4676]: E1204 15:20:39.384694 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:39 crc kubenswrapper[4676]: E1204 15:20:39.385034 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.474233 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.474292 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.474306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.474326 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.474341 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.577001 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.577045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.577056 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.577076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.577088 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.680200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.680257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.680269 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.680288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.680300 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.783054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.783100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.783130 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.783147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.783158 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
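[Editor's note] Two distinct failures appear in the entries that follow. First, the ovnkube-controller container exits with code 1 and the kubelet puts it in CrashLoopBackOff ("back-off 10s"; the kubelet's restart back-off starts at 10 seconds and roughly doubles per failed restart, capped at five minutes). Second, every status_manager patch below is rejected for the same root cause: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a TLS certificate whose NotAfter is 2025-08-24T17:21:41Z, months behind the node clock (2025-12-04). A sketch for confirming such an expiry from a serving certificate's PEM file — the file path is whatever the webhook actually mounts on this node, so it is passed as an argument:

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: certcheck <pem-file>")
		os.Exit(1)
	}
	data, err := os.ReadFile(os.Args[1])
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	now := time.Now().UTC()
	fmt.Printf("NotBefore=%s NotAfter=%s now=%s\n", cert.NotBefore, cert.NotAfter, now)
	if now.After(cert.NotAfter) {
		// The failure mode in the entries below:
		// "x509: certificate has expired or is not yet valid".
		fmt.Println("certificate has expired")
	}
}
```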
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.784365 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/1.log"
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.785081 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/0.log"
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.788285 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149" exitCode=1
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.788330 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149"}
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.788436 4676 scope.go:117] "RemoveContainer" containerID="e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327"
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.789427 4676 scope.go:117] "RemoveContainer" containerID="25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149"
Dec 04 15:20:39 crc kubenswrapper[4676]: E1204 15:20:39.789698 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3"
Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.806300 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.823329 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.836670 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.850365 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.865757 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.877154 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.886765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.886805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.886815 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.886833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.886850 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:39Z","lastTransitionTime":"2025-12-04T15:20:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.890952 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.904292 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:39 crc kubenswrapper[4676]: I1204 15:20:39.917660 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:39Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.090386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.090440 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.090455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.090476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.090492 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.097824 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.134456 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 
15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.154431 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.177572 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.194270 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.196502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.196558 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.196576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.196609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.196627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.218824 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2
5055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.241410 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:40Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.287681 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:40 crc kubenswrapper[4676]: E1204 15:20:40.288009 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:40 crc kubenswrapper[4676]: E1204 15:20:40.288190 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:44.288137159 +0000 UTC m=+51.722807016 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.301572 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.301633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.301658 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.301678 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.301694 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.384434 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:40 crc kubenswrapper[4676]: E1204 15:20:40.384781 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.407173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.407331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.407349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.407375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.407387 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.511993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.513018 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.513067 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.513101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.513125 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.616231 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.616290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.616308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.616349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.616372 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.720015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.720055 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.720064 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.720080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.720090 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.796897 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/1.log" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.823666 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.823748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.823770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.823804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.823829 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.926585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.926637 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.926650 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.926665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:40 crc kubenswrapper[4676]: I1204 15:20:40.926677 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:40Z","lastTransitionTime":"2025-12-04T15:20:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.030595 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.030660 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.030682 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.030710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.030730 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.134665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.134739 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.134770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.134839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.134867 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.237395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.237442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.237454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.237475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.237491 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.341296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.341372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.341393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.341426 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.341453 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.384271 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.384308 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.384285 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:41 crc kubenswrapper[4676]: E1204 15:20:41.384470 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:41 crc kubenswrapper[4676]: E1204 15:20:41.384609 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:41 crc kubenswrapper[4676]: E1204 15:20:41.385091 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.443978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.444036 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.444045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.444060 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.444069 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.547298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.547377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.547416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.547442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.547573 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.655071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.655569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.655693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.655808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.655916 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.758214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.758270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.758282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.758307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.758321 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.862110 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.862162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.862171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.862192 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.862203 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.966045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.966089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.966101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.966141 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:41 crc kubenswrapper[4676]: I1204 15:20:41.966155 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:41Z","lastTransitionTime":"2025-12-04T15:20:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.069324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.069397 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.069413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.069443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.069458 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.172443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.172494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.172507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.172522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.172532 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.275352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.275406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.275418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.275435 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.275444 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.378102 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.378135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.378145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.378161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.378173 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.383189 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:42 crc kubenswrapper[4676]: E1204 15:20:42.383304 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.481431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.481506 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.481518 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.481546 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.481562 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.584150 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.584205 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.584217 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.584236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.584249 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.640372 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.653960 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.658872 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.673526 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.686950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.687007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.687020 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.687045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.687062 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.688362 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.700944 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc 
kubenswrapper[4676]: I1204 15:20:42.716554 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.735469 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.752132 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.774724 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20c
c3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch 
factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.790528 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.791320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.791474 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.791499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.791520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.791532 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.916313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.916392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.916407 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.916430 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.916443 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:42Z","lastTransitionTime":"2025-12-04T15:20:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.931719 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.944316 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.959823 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.976031 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:42 crc kubenswrapper[4676]: I1204 15:20:42.991877 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:42Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.006505 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.019716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.019762 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.019771 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.019791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.019802 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.023405 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.123309 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.123364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.123378 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.123406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.123421 4676 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.226411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.226453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.226463 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.226482 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.226491 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.329252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.329312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.329339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.329376 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.329399 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.384233 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.384233 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.384259 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:43 crc kubenswrapper[4676]: E1204 15:20:43.384520 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:43 crc kubenswrapper[4676]: E1204 15:20:43.384567 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:43 crc kubenswrapper[4676]: E1204 15:20:43.384700 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.411711 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.429632 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.431857 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.431918 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.431933 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.431953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.431965 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.442145 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.455542 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.470935 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.484234 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.497966 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/e
nv\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.509980 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.526878 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.533772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.533801 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.533810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.533826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.533836 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.543617 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.555453 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.577004 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] 
Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.591675 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.606768 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.618803 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.637009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.637052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.637062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.637079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.637091 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.638431 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb
9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.651930 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\
\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:43Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.739567 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.739613 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.739630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.739651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.739663 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.843351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.843400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.843413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.843434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.843447 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.946650 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.946694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.946705 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.946723 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:43 crc kubenswrapper[4676]: I1204 15:20:43.946737 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.049604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.049736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.049748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.049769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.049782 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.153065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.153123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.153136 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.153162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.153175 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
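The "Node became not ready" entries repeat roughly every 100ms, each carrying the same Ready=False condition. A small sketch, assuming nothing beyond the condition JSON printed in these entries, that parses one condition for easier inspection:

    // condition_parse.go — field names mirror the condition={...} output above.
    package main

    import (
        "encoding/json"
        "fmt"
    )

    type NodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        // Shortened copy of a condition from the log entries above.
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:43Z","lastTransitionTime":"2025-12-04T15:20:43Z","reason":"KubeletNotReady","message":"container runtime network not ready"}`
        var c NodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            panic(err)
        }
        fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
    }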
Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.256415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.256458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.256470 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.256487 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.256497 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.326628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.326850 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.326957 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:20:52.326936576 +0000 UTC m=+59.761606433 (durationBeforeRetry 8s). 
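The metrics-certs mount failure below is rescheduled with durationBeforeRetry 8s, consistent with the kubelet doubling its retry delay after each failed attempt. Illustrative only: the 500ms initial delay and ~2m cap are assumptions about kubelet defaults, not values read from this log:

    // backoff_sketch.go — reproduces the doubling pattern implied by "durationBeforeRetry 8s"
    // (0.5s -> 1s -> 2s -> 4s -> 8s ...). Initial delay and cap are assumed, not from the log.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay := 500 * time.Millisecond        // assumed initial delay
        maxDelay := 2*time.Minute + 2*time.Second // assumed cap
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("attempt %d: wait %v before retrying\n", attempt, delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }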
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.359499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.359545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.359559 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.359582 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.359592 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.384141 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.384463 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.466606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.466658 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.466672 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.466703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.466722 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
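Every NetworkPluginNotReady message points at the same root check: no CNI network configuration exists in /etc/kubernetes/cni/net.d/. A standalone sketch performing the equivalent check; the directory name comes from the log message, the accepted extensions are an assumption:

    // cni_check.go — look for CNI network configs where the kubelet expects them.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the log message
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // assumed config extensions
                fmt.Println("found CNI config:", e.Name())
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file — matches the NetworkPluginNotReady error above")
        }
    }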
Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.569386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.569775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.569869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.570004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.570127 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.646333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.646375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.646384 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.646403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.646415 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.661630 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:44Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.666347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.666399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.666418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.666442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.666456 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.680642 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:44Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.685675 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.685839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.685961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.686071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.686175 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.698980 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:44Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.703495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.703544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.703557 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.703574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.703586 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.716091 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:44Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.719898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.719950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.719961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.719977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.719988 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.731037 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:44Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:44 crc kubenswrapper[4676]: E1204 15:20:44.731356 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
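
The patch above is rejected before its content matters: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate whose notAfter (2025-08-24T17:21:41Z) lies months behind the node clock (2025-12-04T15:20:44Z), so every status-update retry fails identically until the retry budget is exhausted. A minimal, illustrative Python sketch (not part of kubelet or OpenShift) that assumes only the exact Go x509 phrasing shown in this journal and computes how far past expiry the clock is:

    import re
    from datetime import datetime

    # Error text copied from the journal entry above; the regex relies on the
    # Go x509 wording "current time <t1> is after <t2>".
    ERR = ('tls: failed to verify certificate: x509: certificate has expired '
           'or is not yet valid: current time 2025-12-04T15:20:44Z is after '
           '2025-08-24T17:21:41Z')

    m = re.search(r'current time (\S+) is after (\S+)', ERR)
    if m:
        now, not_after = (datetime.fromisoformat(t.replace('Z', '+00:00'))
                          for t in m.groups())
        delta = now - not_after
        print(f'webhook certificate expired {delta.days} days '
              f'({delta.total_seconds():.0f} s) before the observed clock')

For the timestamps in this journal the sketch prints a gap of 101 days, which suggests a certificate that aged out while the cluster was powered off rather than a mis-set clock.
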
event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.733148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.733158 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.733176 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.733187 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.836094 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.836160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.836173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.836194 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.836304 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.940072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.940155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.940169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.940206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:44 crc kubenswrapper[4676]: I1204 15:20:44.940280 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:44Z","lastTransitionTime":"2025-12-04T15:20:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.043484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.043555 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.043568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.043587 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.043599 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.146634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.146684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.146696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.146717 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.146727 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.250682 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.250757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.250769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.250788 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.250801 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.354630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.354671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.354680 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.354698 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.354709 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.384064 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.384166 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:45 crc kubenswrapper[4676]: E1204 15:20:45.384274 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.384170 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:45 crc kubenswrapper[4676]: E1204 15:20:45.384358 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:45 crc kubenswrapper[4676]: E1204 15:20:45.384422 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.457787 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.457828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.457839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.457855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.457864 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.561250 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.561304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.561313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.561333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.561349 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.664656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.664727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.664743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.664772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.664793 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.769754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.769818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.769830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.769852 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.769865 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.873357 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.873404 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.873416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.873432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.873443 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.976193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.976256 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.976272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.976291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:45 crc kubenswrapper[4676]: I1204 15:20:45.976302 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:45Z","lastTransitionTime":"2025-12-04T15:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.079663 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.079706 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.079715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.079732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.079743 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.182392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.182454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.182466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.182481 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.182493 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.286134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.286200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.286210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.286230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.286241 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.384093 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:46 crc kubenswrapper[4676]: E1204 15:20:46.384286 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.389062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.389224 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.389592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.389626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.389637 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.494808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.494853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.494897 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.494932 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.494944 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.597357 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.597406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.597418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.597434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.597444 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.700950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.701002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.701015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.701033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.701044 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.804147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.804191 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.804199 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.804215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.804225 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.906828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.906867 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.906879 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.906895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:46 crc kubenswrapper[4676]: I1204 15:20:46.906921 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:46Z","lastTransitionTime":"2025-12-04T15:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.009522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.009560 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.009570 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.009585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.009594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.112216 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.112261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.112279 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.112304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.112318 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.215652 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.215703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.215718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.215738 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.215751 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.319168 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.319213 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.319224 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.319243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.319254 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.387771 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:47 crc kubenswrapper[4676]: E1204 15:20:47.387981 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.388079 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.581495 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:47 crc kubenswrapper[4676]: E1204 15:20:47.581686 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:47 crc kubenswrapper[4676]: E1204 15:20:47.388148 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.594866 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.594943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.594956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.594978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.594990 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.697959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.698014 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.698023 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.698039 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.698049 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.801301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.801983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.802247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.802436 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.802620 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.906459 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.906511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.906521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.906542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:47 crc kubenswrapper[4676]: I1204 15:20:47.906564 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:47Z","lastTransitionTime":"2025-12-04T15:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.010241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.010302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.010315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.010335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.010348 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.113293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.113360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.113377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.113398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.113410 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.216352 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.216396 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.216405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.216421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.216440 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.319563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.319614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.319627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.319645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.319658 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.384161 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:48 crc kubenswrapper[4676]: E1204 15:20:48.384413 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.422598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.422661 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.422679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.422707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.422726 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.526516 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.526584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.526596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.526615 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.526627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.629429 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.629476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.629486 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.629502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.629511 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.732875 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.732957 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.732969 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.732989 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.733001 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.836227 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.836298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.836322 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.836360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.836398 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.938815 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.939171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.939225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.939244 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:48 crc kubenswrapper[4676]: I1204 15:20:48.939253 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:48Z","lastTransitionTime":"2025-12-04T15:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.042316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.042373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.042388 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.042413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.042432 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.144944 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.145001 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.145011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.145029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.145039 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.197682 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.198013 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:21:21.197958139 +0000 UTC m=+88.632627996 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.198415 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.198573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.198702 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.198854 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.198649 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199151 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199286 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.198798 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.198810 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:49 crc 
kubenswrapper[4676]: E1204 15:20:49.199046 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199724 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199497 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:21:21.199475873 +0000 UTC m=+88.634145730 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199742 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199794 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:21:21.199749761 +0000 UTC m=+88.634419618 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199826 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:21:21.199817103 +0000 UTC m=+88.634486960 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.199849 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:21:21.199842654 +0000 UTC m=+88.634512511 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.247984 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.248048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.248061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.248082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.248093 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.351303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.351364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.351375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.351443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.351456 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.383825 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.383852 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.384112 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.383942 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.384197 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:49 crc kubenswrapper[4676]: E1204 15:20:49.384232 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.454550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.454633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.454645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.454665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.454676 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.557471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.557518 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.557530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.557557 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.557571 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.660045 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.660088 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.660098 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.660113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.660124 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.764067 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.764508 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.764633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.764775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.764885 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.867877 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.868565 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.868636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.868779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.868848 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.972026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.972100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.972112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.972132 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:49 crc kubenswrapper[4676]: I1204 15:20:49.972143 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:49Z","lastTransitionTime":"2025-12-04T15:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.075491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.075830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.075896 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.076113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.076200 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.178998 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.179058 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.179072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.179096 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.179113 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.281882 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.281974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.281988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.282011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.282026 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.383500 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:50 crc kubenswrapper[4676]: E1204 15:20:50.383693 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.385835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.385893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.385932 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.385956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.385968 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.488874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.488941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.488952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.488975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.488985 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.591845 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.591894 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.591934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.591961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.591972 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.694441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.694490 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.694498 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.694523 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.694541 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.797334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.797406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.797424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.797455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.797471 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.900543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.900636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.900653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.900675 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:50 crc kubenswrapper[4676]: I1204 15:20:50.900689 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:50Z","lastTransitionTime":"2025-12-04T15:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.003360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.003400 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.003412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.003431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.003442 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.105999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.106046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.106059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.106074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.106085 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.209162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.209215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.209226 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.209244 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.209258 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.312422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.312478 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.312490 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.312519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.312577 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.384375 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.384375 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:51 crc kubenswrapper[4676]: E1204 15:20:51.384605 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.384400 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:51 crc kubenswrapper[4676]: E1204 15:20:51.384850 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:51 crc kubenswrapper[4676]: E1204 15:20:51.384883 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.416379 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.416431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.416442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.416460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.416472 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.519405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.519460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.519472 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.519493 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.519506 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.621991 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.622059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.622071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.622094 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.622106 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.724436 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.724511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.724530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.724548 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.724559 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.827645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.827695 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.827708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.827728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.827742 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.930876 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.930965 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.930976 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.930992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:51 crc kubenswrapper[4676]: I1204 15:20:51.931003 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:51Z","lastTransitionTime":"2025-12-04T15:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.033813 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.033865 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.033877 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.033895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.033926 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.137248 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.137673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.137753 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.137857 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.137999 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.226085 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.240924 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.240977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.240989 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.241012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.241109 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.242580 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.261547 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.277892 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.290412 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 
15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.302832 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.318961 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.333705 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.344009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.344064 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.344078 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.344099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.344113 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.348667 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.367126 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:52 crc kubenswrapper[4676]: E1204 15:20:52.367316 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:52 crc kubenswrapper[4676]: E1204 15:20:52.367404 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:21:08.367383519 +0000 UTC m=+75.802053376 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.370259 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20c
c3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch 
factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.384005 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:52 crc kubenswrapper[4676]: E1204 15:20:52.384163 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.385066 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.401627 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.414944 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.431988 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.446535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.447103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.447150 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.447160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.447184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.447198 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.461372 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.474499 4676 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6
feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.486443 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:52Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.549955 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.550000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.550013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.550029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.550038 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.653232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.653287 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.653303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.653324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.653339 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.756240 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.756297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.756316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.756341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.756351 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.858839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.858895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.858928 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.858950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.858963 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.962486 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.962567 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.962585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.962617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:52 crc kubenswrapper[4676]: I1204 15:20:52.962631 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:52Z","lastTransitionTime":"2025-12-04T15:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.065247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.065304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.065317 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.065339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.065357 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.167750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.167818 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.167828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.167848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.167860 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.271141 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.271206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.271219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.271241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.271252 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.374258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.374300 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.374318 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.374336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.374350 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.383703 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:53 crc kubenswrapper[4676]: E1204 15:20:53.383841 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.384122 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:53 crc kubenswrapper[4676]: E1204 15:20:53.384339 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.383703 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:53 crc kubenswrapper[4676]: E1204 15:20:53.384646 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.405535 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.419517 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.430856 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.447996 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.465252 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.477137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.477189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.477203 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.477225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.477237 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.483545 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.501292 4676 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6
feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.515101 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.533679 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938006
6b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.551719 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.567105 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.579799 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.579844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.579854 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.579874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.579888 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.583114 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.596354 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc 
Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.613499 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.628575 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.643462 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.667806 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20c
c3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5cec5c13704ede90e092a6825977590a61275d083d329f04301a87482140327\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"r.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.109880 5914 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:20:36.110142 5914 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110339 5914 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110479 5914 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.110641 5914 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:20:36.111436 5914 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:36.111466 5914 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:36.111472 5914 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:36.111554 5914 factory.go:656] Stopping watch factory\\\\nI1204 15:20:36.111604 5914 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:36.111884 5914 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch 
factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-
o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:53Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.682805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.682858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.682873 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.682895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.682948 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.785448 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.785513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.785524 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.785545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.785562 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.888356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.888414 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.888431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.888453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.888468 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.991659 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.991712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.991725 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.991743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:53 crc kubenswrapper[4676]: I1204 15:20:53.991757 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:53Z","lastTransitionTime":"2025-12-04T15:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.095100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.095154 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.095185 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.095204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.095214 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.198254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.198302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.198316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.198333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.198347 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.302263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.302333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.302345 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.302364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.302377 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.384251 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.384600 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.384943 4676 scope.go:117] "RemoveContainer" containerID="25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.406332 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\
\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc
45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.407189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.407243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.407255 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.407275 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.407286 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.426529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.442646 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.461529 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.475090 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.489225 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.604458 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.604494 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.604503 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.604520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.604532 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.627885 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.645056 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.665259 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502
c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.682357 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.699397 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.707727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.707781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.707796 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.707819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.707834 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.718189 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.737119 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.753872 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.779121 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.805119 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
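\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z"

Every status patch in this window is rejected for the same reason: the serving certificate of the pod.network-node-identity.openshift.io webhook expired at 2025-08-24T17:21:41Z, while the node clock reads 2025-12-04T15:20:54Z. The "certificate has expired or is not yet valid" text comes from the standard NotBefore/NotAfter validity-window comparison in Go's crypto/x509. A minimal sketch of that comparison, where the certificate file path is a placeholder rather than the actual webhook secret:

```go
// certcheck.go - sketch of the NotBefore/NotAfter comparison behind
// "x509: certificate has expired or is not yet valid".
// The PEM file path below is a placeholder, not the real secret location.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("webhook-cert.pem") // placeholder path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now()
	switch {
	case now.After(cert.NotAfter):
		// The branch this log is hitting:
		// current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z.
		fmt.Printf("expired: current time %s is after %s\n",
			now.UTC().Format(time.RFC3339),
			cert.NotAfter.UTC().Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid before %s\n",
			cert.NotBefore.UTC().Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```

Until that webhook certificate is rotated (or the node clock agrees with the certificate's window), the kubelet will keep retrying and logging these failures for every pod whose status it tries to patch.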
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.810988 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.811039 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.811052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.811076 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.811091 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.822241 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.828344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.828398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.828412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.828433 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.828446 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
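Has your network provider started?"}

The "Node became not ready" entries from setters.go embed the node's Ready condition as inline JSON. A small sketch decoding that shape, using a struct that mirrors only the NodeCondition fields actually visible in this log (restricting to that subset is an assumption made for illustration):

```go
// nodecond.go - sketch decoding the condition JSON embedded in the
// "Node became not ready" entries above. Field names mirror the
// Kubernetes NodeCondition type; only fields visible in the log appear.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition copied verbatim from the log entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}
```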
Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.845414 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.850541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.850592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.850608 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.850627 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.850639 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.877212 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.884337 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.884392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
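Every retry in this burst fails the same way: the API server cannot call the network-node-identity webhook because its serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-04. One way to confirm the certificate window directly from the node is the short Python sketch below; it is illustrative only, the host and port come from the Post URL in the errors above, and it assumes the third-party cryptography package is installed.

import datetime
import socket
import ssl

from cryptography import x509

HOST, PORT = "127.0.0.1", 9743  # endpoint from the webhook Post URL above

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False       # we only want the peer certificate;
ctx.verify_mode = ssl.CERT_NONE  # verification is exactly what is failing

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with ctx.wrap_socket(sock) as tls:
        der = tls.getpeercert(binary_form=True)  # raw DER certificate

cert = x509.load_der_x509_certificate(der)
now = datetime.datetime.now(datetime.timezone.utc)
# not_valid_after_utc needs cryptography >= 42; older releases expose
# the naive not_valid_after instead.
print("notAfter:", cert.not_valid_after_utc)
print("expired: ", now > cert.not_valid_after_utc)

If the printed notAfter matches 2025-08-24T17:21:41Z, the failure is the stale serving certificate itself rather than a trust-chain or client-side clock problem.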
event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.884405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.884425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.884439 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.900477 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.905838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.905881 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
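The kubelet does not retry this indefinitely within one sync: upstream kubelet attempts the status update a fixed nodeStatusUpdateRetry number of times (5 in the upstream source) and then gives up, which matches the shape of this burst (attempts at .845414, .877212, .900477, .922001 and .943539, followed by the "update node status exceeds retry count" record at .943664 below). A small stdlib-only sketch for counting such bursts, assuming one journal record per line as journalctl normally emits them; the script name and the use of stdin are illustrative:

import re
import sys

retry = re.compile(r'"Error updating node status, will retry"')
gave_up = re.compile(r'"Unable to update node status"')

attempts = 0
for line in sys.stdin:  # e.g. journalctl -u kubelet | python3 count_retries.py
    if retry.search(line):
        attempts += 1
    elif gave_up.search(line):
        # upstream kubelet gives up after nodeStatusUpdateRetry (5) attempts
        print(f"gave up after {attempts} attempts")
        attempts = 0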
event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.905893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.905924 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.905936 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.922001 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.927092 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.927148 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
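Independently of the webhook failure, each "Node became not ready" record carries the same KubeletNotReady cause: the container runtime reports NetworkReady=false because nothing has written a CNI config under /etc/kubernetes/cni/net.d/ yet. Below is a minimal sketch of that directory check, assuming the standard libcni file extensions (.conf, .conflist, .json); on this cluster the OVN-Kubernetes node pod is what eventually drops the config, so an empty directory is expected until ovnkube-node recovers:

from pathlib import Path

NET_DIR = Path("/etc/kubernetes/cni/net.d")  # directory named in the kubelet message
EXTS = {".conf", ".conflist", ".json"}       # extensions libcni will load

configs = [p for p in sorted(NET_DIR.glob("*")) if p.suffix in EXTS] if NET_DIR.is_dir() else []
print(f"{len(configs)} CNI config(s) in {NET_DIR}")
for p in configs:
    print(" -", p.name)
# zero results corresponds to the "no CNI configuration file" condition above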
event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.927161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.927180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.927194 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.943539 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:54 crc kubenswrapper[4676]: E1204 15:20:54.943664 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.945583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.945622 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.945632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.945653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.945664 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:54Z","lastTransitionTime":"2025-12-04T15:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.965205 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/1.log" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.970227 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5"} Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.970790 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:20:54 crc kubenswrapper[4676]: I1204 15:20:54.987216 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:54Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.009666 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.030064 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.047964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.048017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.048031 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.048068 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.048083 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.050262 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.066082 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc 
kubenswrapper[4676]: I1204 15:20:55.086945 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.114607 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.143190 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.151502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.151563 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.151577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.151603 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.151620 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.160987 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.255215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.255272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.255283 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.255355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.255382 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.358971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.359040 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.359061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.359096 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.359118 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.449063 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.449131 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:55 crc kubenswrapper[4676]: E1204 15:20:55.449227 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:55 crc kubenswrapper[4676]: E1204 15:20:55.449315 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.449071 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:55 crc kubenswrapper[4676]: E1204 15:20:55.449421 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.458645 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517
e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 
15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.462257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.462288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.462298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.462315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.462348 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.480728 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.495616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.514228 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.533014 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/r
un/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.550415 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566330 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566345 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566383 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.566746 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.581238 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.669956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.669994 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.670005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.670065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.670079 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.774361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.774413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.774425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.774447 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.774468 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.877544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.877609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.877622 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.877642 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.877653 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.980620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.980673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.980688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.980707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:55 crc kubenswrapper[4676]: I1204 15:20:55.980723 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:55Z","lastTransitionTime":"2025-12-04T15:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.083784 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.083823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.083833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.083848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.083858 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.186987 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.187037 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.187050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.187070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.187082 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.289863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.289924 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.289972 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.289993 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.290007 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.383647 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:56 crc kubenswrapper[4676]: E1204 15:20:56.384031 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.392820 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.392868 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.392882 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.392900 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.393001 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.496013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.496471 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.496489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.496508 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.496520 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.599794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.599874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.599885 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.599925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.599938 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.702719 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.702841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.702855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.702878 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.702892 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.806470 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.806520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.806532 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.806551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.806563 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.909730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.909786 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.909800 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.910049 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.910064 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:56Z","lastTransitionTime":"2025-12-04T15:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.978790 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/2.log" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.979430 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/1.log" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.983436 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5" exitCode=1 Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.983494 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5"} Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.983551 4676 scope.go:117] "RemoveContainer" containerID="25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149" Dec 04 15:20:56 crc kubenswrapper[4676]: I1204 15:20:56.984382 4676 scope.go:117] "RemoveContainer" containerID="ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5" Dec 04 15:20:56 crc kubenswrapper[4676]: E1204 15:20:56.984575 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.002643 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:56Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.012200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.012241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.012254 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.012276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.012294 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.018211 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.035304 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.052121 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.068518 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 
15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.083811 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.099353 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.114821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.115144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.115224 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.115321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.115396 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.115593 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.132565 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.148519 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.173233 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517
e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update 
Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\
\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.188396 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.203977 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.218054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.218112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.218126 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.218193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.218219 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.224157 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.241442 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/
etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.259618 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.276780 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:57Z is after 2025-08-24T17:21:41Z" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.321170 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 
crc kubenswrapper[4676]: I1204 15:20:57.321499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.321566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.321677 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.321821 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.384153 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.384170 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:57 crc kubenswrapper[4676]: E1204 15:20:57.384341 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:57 crc kubenswrapper[4676]: E1204 15:20:57.384442 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.384202 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:57 crc kubenswrapper[4676]: E1204 15:20:57.384529 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.424473 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.424525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.424537 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.424558 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.424572 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.528277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.528354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.528372 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.528390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.528403 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.631477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.631542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.631554 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.631573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.631583 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.735225 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.735266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.735296 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.735315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.735328 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.838518 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.838568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.838578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.838598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.838610 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.942266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.942323 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.942338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.942359 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.942383 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:57Z","lastTransitionTime":"2025-12-04T15:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:57 crc kubenswrapper[4676]: I1204 15:20:57.989181 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/2.log" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.045881 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.045961 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.045974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.045992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.046007 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.148998 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.149080 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.149100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.149122 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.149136 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.252542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.252603 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.252618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.252638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.252652 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.355648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.355692 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.355701 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.355717 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.355727 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.383297 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:20:58 crc kubenswrapper[4676]: E1204 15:20:58.383462 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.458070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.458110 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.458122 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.458139 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.458148 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.561765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.561862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.561877 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.561895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.561931 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.664265 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.664334 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.664347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.664383 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.664396 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.768053 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.768115 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.768131 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.768157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.768172 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.871572 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.871623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.871636 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.871656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.871671 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.974109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.974145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.974157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.974173 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:58 crc kubenswrapper[4676]: I1204 15:20:58.974183 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:58Z","lastTransitionTime":"2025-12-04T15:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.077640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.077696 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.077710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.077729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.077743 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.180645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.180682 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.180735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.180757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.180769 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.284137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.284476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.284574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.284707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.284779 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.384018 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.384048 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:20:59 crc kubenswrapper[4676]: E1204 15:20:59.384188 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:20:59 crc kubenswrapper[4676]: E1204 15:20:59.384353 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.384651 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:20:59 crc kubenswrapper[4676]: E1204 15:20:59.384749 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.387667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.387699 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.387709 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.387727 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.387737 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.490734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.490775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.490785 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.490803 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.490813 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.593209 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.593266 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.593314 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.593340 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.593353 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.696391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.696451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.696476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.696496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.696512 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.799204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.799288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.799298 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.799319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.799330 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.902153 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.902200 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.902215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.902235 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:20:59 crc kubenswrapper[4676]: I1204 15:20:59.902250 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:20:59Z","lastTransitionTime":"2025-12-04T15:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.005836 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.005878 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.005889 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.005925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.005935 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.109315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.109756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.109922 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.110017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.110103 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.212405 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.212469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.212484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.212510 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.212527 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.317123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.317180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.317196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.317217 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.317230 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.384178 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:00 crc kubenswrapper[4676]: E1204 15:21:00.384384 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.420534 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.420583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.420597 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.420620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.420635 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.523427 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.523496 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.523510 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.523535 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.523550 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.627180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.627230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.627238 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.627257 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.627268 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.730097 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.730132 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.730142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.730160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.730174 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.833349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.833419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.833433 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.833455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.833476 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.936708 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.936761 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.936774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.936793 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:00 crc kubenswrapper[4676]: I1204 15:21:00.936806 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:00Z","lastTransitionTime":"2025-12-04T15:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.040248 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.040369 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.040390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.040415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.040441 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.143381 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.143450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.143461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.143480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.143490 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.246479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.246544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.246564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.246584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.246594 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.349750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.349797 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.349808 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.349826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.349836 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.384302 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.384356 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.384329 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:01 crc kubenswrapper[4676]: E1204 15:21:01.384573 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:01 crc kubenswrapper[4676]: E1204 15:21:01.384638 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:01 crc kubenswrapper[4676]: E1204 15:21:01.384731 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.453288 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.453336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.453348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.453373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.453390 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.556712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.556769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.556783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.556805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.556830 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.660249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.660304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.660319 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.660339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.660360 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.763085 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.763425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.763577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.763685 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.763777 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.867278 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.867356 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.867398 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.867426 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.867446 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.970890 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.970956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.970967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.970986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:01 crc kubenswrapper[4676]: I1204 15:21:01.970997 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:01Z","lastTransitionTime":"2025-12-04T15:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.074341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.074448 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.074464 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.074486 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.074504 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.177999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.178054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.178065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.178086 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.178110 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.280763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.280810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.280823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.280840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.280855 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.383533 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:02 crc kubenswrapper[4676]: E1204 15:21:02.383820 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.384049 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.384070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.384082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.384119 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.384138 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.487874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.487959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.487974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.487995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.488022 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.591438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.591505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.591520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.591542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.591560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.694351 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.694408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.694422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.694441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.694456 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.798164 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.798215 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.798228 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.798251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.798264 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.901456 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.901507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.901517 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.901536 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:02 crc kubenswrapper[4676]: I1204 15:21:02.901547 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:02Z","lastTransitionTime":"2025-12-04T15:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.005082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.005134 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.005145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.005166 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.005178 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.108778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.108833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.108844 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.108864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.108875 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.212375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.212444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.212457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.212479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.212492 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.316137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.316179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.316194 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.316212 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.316223 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.384284 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.384312 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:03 crc kubenswrapper[4676]: E1204 15:21:03.384530 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.384343 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:03 crc kubenswrapper[4676]: E1204 15:21:03.384833 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:03 crc kubenswrapper[4676]: E1204 15:21:03.384881 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.402154 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.417500 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.418201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.418235 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.418245 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.418263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.418274 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.431594 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.442344 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.459949 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.475824 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.496117 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.514923 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.521991 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.522484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.522743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.523003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.523204 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.533236 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.555898 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25ef23e3ee2ac21cd9c2031873656ab7784ac20cc3b31cb333a4b919d2760149\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:38Z\\\",\\\"message\\\":\\\"ernalversions/factory.go:140\\\\nI1204 15:20:38.902366 6131 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1204 15:20:38.906083 6131 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1204 15:20:38.906140 6131 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1204 15:20:38.906196 6131 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:20:38.906194 6131 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:20:38.906252 6131 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:20:38.906281 6131 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1204 15:20:38.906308 6131 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1204 15:20:38.906310 6131 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1204 15:20:38.906362 6131 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1204 15:20:38.906398 6131 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:20:38.906422 6131 factory.go:656] Stopping watch factory\\\\nI1204 15:20:38.906435 6131 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:20:38.906465 6131 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1204 
15:20:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvs
witch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.573845 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.597063 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.618074 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.625858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.625931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.625946 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.625966 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.625978 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.636543 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.655056 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.684326 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.704153 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:03Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.869438 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.869492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.869504 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.869523 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.869535 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.972273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.972315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.972327 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.972349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:03 crc kubenswrapper[4676]: I1204 15:21:03.972362 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:03Z","lastTransitionTime":"2025-12-04T15:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.074550 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.074597 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.074606 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.074623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.074635 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.177662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.177712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.177723 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.177748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.177772 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.280668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.280718 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.280728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.280746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.280757 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.383607 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:04 crc kubenswrapper[4676]: E1204 15:21:04.383786 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.384018 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.384058 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.384068 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.384084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.384094 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.486967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.487037 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.487057 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.487083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.487098 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.590162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.590239 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.590253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.590327 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.590346 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.692768 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.693129 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.693230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.693318 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.693395 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.795883 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.795945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.795956 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.795971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.795981 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.899189 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.899619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.899689 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.899758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:04 crc kubenswrapper[4676]: I1204 15:21:04.899824 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:04Z","lastTransitionTime":"2025-12-04T15:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.002739 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.003111 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.003162 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.003192 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.003221 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.107366 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.107427 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.107443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.107468 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.107482 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.211208 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.211263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.211275 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.211297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.211309 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.255074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.255130 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.255144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.255169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.255184 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.272495 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:05Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.278313 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.278374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.278391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.278413 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.278432 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.295417 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:05Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.299856 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.299890 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.299923 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.299950 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.299961 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.313239 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:05Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.319324 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.319373 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.319388 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.319407 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.319420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.334627 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:05Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.340735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.340774 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.340783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.340802 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.340812 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.355110 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:05Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.355333 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.357426 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.357461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.357469 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.357488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.357499 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.383590 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.383678 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.383813 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.383969 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.384069 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:05 crc kubenswrapper[4676]: E1204 15:21:05.384217 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.461174 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.461219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.461229 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.461248 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.461259 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.564361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.564412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.564424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.564447 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.564460 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.667395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.667454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.667465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.667485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:05 crc kubenswrapper[4676]: I1204 15:21:05.667502 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:05Z","lastTransitionTime":"2025-12-04T15:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:06 crc kubenswrapper[4676]: I1204 15:21:06.383622 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq"
Dec 04 15:21:06 crc kubenswrapper[4676]: E1204 15:21:06.383821 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217"
Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.178385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.178436 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.178448 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.178472 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.178487 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.281409 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.281443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.281451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.281467 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.281479 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384143 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384163 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:07 crc kubenswrapper[4676]: E1204 15:21:07.384302 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384333 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:07 crc kubenswrapper[4676]: E1204 15:21:07.384438 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:07 crc kubenswrapper[4676]: E1204 15:21:07.384504 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384701 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.384742 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.487923 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.488008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.488029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.488053 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.488070 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.591639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.592108 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.592230 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.592303 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.592421 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.695714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.695775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.695789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.695815 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.695829 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.798662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.799059 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.799181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.799367 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.799546 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.902575 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.902629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.902643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.902662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:07 crc kubenswrapper[4676]: I1204 15:21:07.902676 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:07Z","lastTransitionTime":"2025-12-04T15:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.009261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.009341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.009353 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.009374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.009448 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.112890 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.112951 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.112960 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.112999 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.113009 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.216432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.216519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.216545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.216578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.216627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.320344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.320432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.320455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.320487 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.320507 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.383574 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.383610 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:08 crc kubenswrapper[4676]: E1204 15:21:08.384168 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:21:08 crc kubenswrapper[4676]: E1204 15:21:08.384389 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. 
No retries permitted until 2025-12-04 15:21:40.384362961 +0000 UTC m=+107.819032918 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:21:08 crc kubenswrapper[4676]: E1204 15:21:08.384072 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.384830 4676 scope.go:117] "RemoveContainer" containerID="ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5" Dec 04 15:21:08 crc kubenswrapper[4676]: E1204 15:21:08.385360 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.404645 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.417797 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.424631 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.424671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.424681 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.424701 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.424711 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.434040 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.459112 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"
volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"ru
n-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.480081 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.494505 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.508690 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.526600 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.527408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.527454 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc 
kubenswrapper[4676]: I1204 15:21:08.527466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.527485 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.527497 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.542255 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"n
ame\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.559426 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserv
er-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.573876 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.587993 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.603077 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.621142 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.630375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.630410 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.630420 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.630441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.630453 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.637057 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.650627 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.664619 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:08Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.734234 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.734299 4676 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.734310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.734332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.734350 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.837245 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.837308 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.837320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.837341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.837357 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.940939 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.941032 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.941042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.941058 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:08 crc kubenswrapper[4676]: I1204 15:21:08.941069 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:08Z","lastTransitionTime":"2025-12-04T15:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.044782 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.044830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.044839 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.044861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.044876 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.147940 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.147979 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.147989 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.148007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.148016 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.250642 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.250700 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.250714 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.250733 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.250768 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.353693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.353750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.353763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.353783 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.354156 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.383350 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.383410 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.383499 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:09 crc kubenswrapper[4676]: E1204 15:21:09.383551 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:09 crc kubenswrapper[4676]: E1204 15:21:09.383609 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:09 crc kubenswrapper[4676]: E1204 15:21:09.383670 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.457869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.457964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.457975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.457991 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.458003 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.561221 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.561293 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.561306 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.561333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.561356 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.664271 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.664321 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.664338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.664357 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.664368 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.768181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.768432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.768445 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.768465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.768484 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.871955 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.872027 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.872054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.872083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.872101 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.975489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.975545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.975556 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.975580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:09 crc kubenswrapper[4676]: I1204 15:21:09.975601 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:09Z","lastTransitionTime":"2025-12-04T15:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.079354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.079416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.079432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.079455 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.079467 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.183054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.183117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.183132 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.183156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.183170 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.286722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.286769 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.286778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.286798 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.286809 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.383695 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:10 crc kubenswrapper[4676]: E1204 15:21:10.383984 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.390348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.390399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.390411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.390431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.390443 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.493453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.493508 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.493519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.493539 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.493555 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.596483 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.596568 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.596578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.596597 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.596637 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.699731 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.699778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.699790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.699810 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.699821 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.803275 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.803320 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.803329 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.803347 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.803363 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.906763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.906817 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.906828 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.906849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:10 crc kubenswrapper[4676]: I1204 15:21:10.906861 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:10Z","lastTransitionTime":"2025-12-04T15:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.009551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.009591 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.009601 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.009617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.009630 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.111952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.111998 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.112008 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.112024 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.112034 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.215129 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.215197 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.215210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.215236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.215248 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.317821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.317983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.318010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.318029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.318042 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.383873 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.383877 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:11 crc kubenswrapper[4676]: E1204 15:21:11.384098 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.384279 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:11 crc kubenswrapper[4676]: E1204 15:21:11.384342 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:11 crc kubenswrapper[4676]: E1204 15:21:11.384557 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.421104 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.421160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.421177 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.421196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.421217 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.524517 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.524604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.524617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.524639 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.524655 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.628074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.628124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.628138 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.628157 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.628174 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.730964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.731007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.731018 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.731035 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.731046 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.834633 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.834691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.834703 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.834721 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.834733 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.937309 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.937364 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.937376 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.937395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:11 crc kubenswrapper[4676]: I1204 15:21:11.937407 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:11Z","lastTransitionTime":"2025-12-04T15:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.040688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.040743 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.040755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.040778 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.040794 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.143729 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.143781 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.143793 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.143814 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.143827 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.246451 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.246511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.246522 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.246545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.246556 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.349505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.349564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.349577 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.349595 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.349608 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.383962 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:12 crc kubenswrapper[4676]: E1204 15:21:12.384206 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.453552 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.453617 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.453630 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.453653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.453666 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.556745 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.556795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.556804 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.556819 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.556830 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.660609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.661004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.661019 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.661038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.661111 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.764139 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.764193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.764206 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.764258 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.764275 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.867390 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.867440 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.867453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.867473 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.867484 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.970348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.970392 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.970403 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.970419 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:12 crc kubenswrapper[4676]: I1204 15:21:12.970429 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:12Z","lastTransitionTime":"2025-12-04T15:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.072711 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.072765 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.072775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.072794 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.072806 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.175602 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.175643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.175652 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.175687 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.175698 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.278219 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.278270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.278282 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.278302 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.278314 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.380820 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.380871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.380882 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.380920 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.380933 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.383546 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:13 crc kubenswrapper[4676]: E1204 15:21:13.383661 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.383822 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:13 crc kubenswrapper[4676]: E1204 15:21:13.383924 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.384212 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:13 crc kubenswrapper[4676]: E1204 15:21:13.384435 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.401190 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.416749 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.429403 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.442672 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.457221 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.471150 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.483444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.483499 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.483513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.483544 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.483560 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.485709 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.497639 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.519107 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e
151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.535595 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.555401 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.569638 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.586402 4676 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.586449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.586460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.586476 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.586492 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.589368 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin
\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.606769 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.625354 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.639559 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.656891 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:13Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.689161 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.689214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc 
kubenswrapper[4676]: I1204 15:21:13.689224 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.689247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.689264 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.792641 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.792756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.792770 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.792792 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.792807 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.895982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.896048 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.896066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.896092 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:13 crc kubenswrapper[4676]: I1204 15:21:13.896110 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:13Z","lastTransitionTime":"2025-12-04T15:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.000341 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.000415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.000434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.000461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.000477 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.104547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.104603 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.104614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.104634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.104645 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.208566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.208623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.208641 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.208667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.208681 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.312186 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.312249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.312263 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.312286 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.312301 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.384367 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:14 crc kubenswrapper[4676]: E1204 15:21:14.384577 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.415750 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.415832 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.415849 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.415873 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.415888 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.518513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.518566 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.518578 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.518598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.518618 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.621935 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.621981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.621990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.622011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.622124 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.725399 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.725453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.725467 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.725488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.725500 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.828551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.828592 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.828601 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.828616 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.828629 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.932004 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.932046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.932062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.932084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:14 crc kubenswrapper[4676]: I1204 15:21:14.932098 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:14Z","lastTransitionTime":"2025-12-04T15:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.035249 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.035301 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.035315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.035338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.035353 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.109205 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/0.log" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.109300 4676 generic.go:334] "Generic (PLEG): container finished" podID="2a201486-d4f3-4677-adad-4028d94e0623" containerID="67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45" exitCode=1 Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.109352 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerDied","Data":"67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.109968 4676 scope.go:117] "RemoveContainer" containerID="67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.135146 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.138753 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.138809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.138823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.138845 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.138856 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.151607 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.182294 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} 
name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.263343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.263446 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.263504 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.263528 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.263542 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.277050 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.294023 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.313209 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.330446 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:14Z\\\",\\\"message\\\":\\\"2025-12-04T15:20:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58\\\\n2025-12-04T15:20:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58 to /host/opt/cni/bin/\\\\n2025-12-04T15:20:29Z [verbose] multus-daemon started\\\\n2025-12-04T15:20:29Z [verbose] Readiness Indicator file check\\\\n2025-12-04T15:21:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.348048 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.363883 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.366425 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.366461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.366472 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.366492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.366505 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.369608 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.369758 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.370028 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.370167 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.370268 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.383633 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.383644 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.383815 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.383874 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.384070 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.384247 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.385048 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.386086 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.394444 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.394497 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.394513 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.394538 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.394554 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.402132 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.422966 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.428547 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.428596 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.428925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.429124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.429155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.429168 4676 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.446023 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.449224 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.452594 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.452766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.452846 4676 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.452966 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.453053 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.469830 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-conf
ig\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.471148 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.475439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.475492 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.475562 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.475583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.475593 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.485240 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.491151 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: E1204 15:21:15.491324 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.493520 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.493640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.493809 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.494026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.494190 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.501596 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edf
d789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.517045 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:15Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.598467 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.598879 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.598977 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.599057 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.599128 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.702270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.702931 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.703015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.703082 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.703140 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.805740 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.806105 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.806171 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.806270 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.806332 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.909805 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.910220 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.910297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.910387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:15 crc kubenswrapper[4676]: I1204 15:21:15.910606 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:15Z","lastTransitionTime":"2025-12-04T15:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.014067 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.014423 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.014511 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.014588 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.014648 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.117785 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/0.log" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.117946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerStarted","Data":"ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.118710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.118893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.119212 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.119537 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.119702 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.143561 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.163123 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.187119 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.209257 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.223447 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.223502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.223521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.223549 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.223569 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.232407 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.255351 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.277645 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.301370 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.325724 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 
2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.327348 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.327393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.327417 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.327450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.327474 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.349301 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.367531 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.384290 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:16 crc kubenswrapper[4676]: E1204 15:21:16.384448 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.391776 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517
e53e48fd9b553511bae7baa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z"
Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.399587 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
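Every status-patch failure recorded above shares a single root cause, spelled out at the end of each err string: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-12-04T15:21:16Z, so the kubelet's TLS handshake fails before any patch is attempted. The following is a minimal Python sketch of that same x509 validity check, not tooling taken from this log; it assumes the third-party cryptography package (version 42 or later for the *_utc accessors) and assumes it runs on the node itself, since the endpoint is loopback-only.

    import socket
    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # assumed third-party dependency, not stdlib

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint quoted in the errors above

    # Fetch the peer certificate without verifying it; verification is exactly
    # what fails in the log, so it is disabled here just to read the cert.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=10) as raw:
        with ctx.wrap_socket(raw, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    now = datetime.now(timezone.utc)
    print("subject:  ", cert.subject.rfc4514_string())
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    # For this node: 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z
    print("expired:  ", now > cert.not_valid_after_utc)

Rotating the certificate (or correcting the node clock, if that is what is wrong) is the fix the message implies; the sketch only confirms which side of the validity window the node sits on.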
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.416983 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.430775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.430821 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.430833 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.430871 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.430886 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.436161 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb
9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.454604 4676 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:14Z\\\",\\\"message\\\":\\\"2025-12-04T15:20:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58\\\\n2025-12-04T15:20:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58 to /host/opt/cni/bin/\\\\n2025-12-04T15:20:29Z [verbose] multus-daemon started\\\\n2025-12-04T15:20:29Z [verbose] Readiness Indicator file check\\\\n2025-12-04T15:21:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.470230 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:16Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.534016 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.534077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.534087 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.534104 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.534115 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.636622 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.636668 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.636679 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.636701 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.636719 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.739442 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.739478 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.739488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.739505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.739515 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.843431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.843502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.843581 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.843614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.843626 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.946243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.946289 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.946297 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.946312 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:16 crc kubenswrapper[4676]: I1204 15:21:16.946322 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:16Z","lastTransitionTime":"2025-12-04T15:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.049693 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.049853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.049865 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.049881 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.049892 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.152422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.152488 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.152502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.152525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.152542 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.255432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.255491 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.255502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.255521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.255534 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.359355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.359422 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.359434 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.359457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.359470 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.384093 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.384144 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:17 crc kubenswrapper[4676]: E1204 15:21:17.384316 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.384419 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:17 crc kubenswrapper[4676]: E1204 15:21:17.384524 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:17 crc kubenswrapper[4676]: E1204 15:21:17.384606 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.461785 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.461850 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.461863 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.461883 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.461896 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.564757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.564825 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.564838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.564860 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.564873 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.668525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.668574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.668586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.668604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.668615 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.771393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.771766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.771893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.772013 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.772111 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.875028 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.875075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.875086 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.875102 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.875114 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.978368 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.978412 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.978423 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.978439 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:17 crc kubenswrapper[4676]: I1204 15:21:17.978450 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:17Z","lastTransitionTime":"2025-12-04T15:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.081584 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.081623 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.081632 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.081656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.081675 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.185089 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.185135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.185145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.185160 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.185170 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.288551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.288611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.288621 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.288640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.288652 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.383688 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:18 crc kubenswrapper[4676]: E1204 15:21:18.384059 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.390953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.391002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.391017 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.391040 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.391051 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.494007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.494069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.494083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.494112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.494128 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.596202 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.596247 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.596259 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.596277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.596287 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.698861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.698943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.698958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.698985 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.699003 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.805047 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.805113 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.805123 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.805145 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.805155 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.909772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.909848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.909869 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.909894 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:18 crc kubenswrapper[4676]: I1204 15:21:18.909925 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:18Z","lastTransitionTime":"2025-12-04T15:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.013022 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.013074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.013084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.013100 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.013109 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.116551 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.116611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.116621 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.116643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.116654 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.220710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.220775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.220788 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.220812 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.220835 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.324580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.324640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.324651 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.324676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.324688 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.383674 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.383674 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:19 crc kubenswrapper[4676]: E1204 15:21:19.383869 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.383702 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:19 crc kubenswrapper[4676]: E1204 15:21:19.383949 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:19 crc kubenswrapper[4676]: E1204 15:21:19.384038 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.427959 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.428011 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.428021 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.428039 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.428049 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.531038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.531093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.531101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.531120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.531130 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.634203 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.634261 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.634272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.634292 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.634305 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.737073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.737140 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.737155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.737180 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.737197 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.840549 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.840626 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.840643 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.841064 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.841101 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.944574 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.944628 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.944638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.944659 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:19 crc kubenswrapper[4676]: I1204 15:21:19.944673 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:19Z","lastTransitionTime":"2025-12-04T15:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.048316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.048386 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.048408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.048431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.048444 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.150691 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.150736 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.150746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.150766 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.150778 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.333151 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.333202 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.333212 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.333229 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.333238 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.383665 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:20 crc kubenswrapper[4676]: E1204 15:21:20.384266 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.384508 4676 scope.go:117] "RemoveContainer" containerID="ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.435954 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.435996 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.436007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.436025 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.436039 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.538838 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.539315 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.539331 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.539355 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.539370 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.643475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.643542 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.643569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.643603 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.643627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.746132 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.746181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.746193 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.746208 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.746219 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.926394 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.926460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.926493 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.926519 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:20 crc kubenswrapper[4676]: I1204 15:21:20.926530 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:20Z","lastTransitionTime":"2025-12-04T15:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.029786 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.029827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.029837 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.029854 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.029865 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.133064 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.133099 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.133110 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.133127 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.133139 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.138661 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/2.log" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.144551 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.145880 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.164687 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.181009 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.194809 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.215928 4676 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} 
name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.226665 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.226800 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.226867 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:25.226839585 +0000 UTC m=+152.661509442 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.226967 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.226986 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227001 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227042 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:22:25.227032951 +0000 UTC m=+152.661702808 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.227081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.227132 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.227168 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227292 4676 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227305 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227334 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227353 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227356 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:22:25.22733195 +0000 UTC m=+152.662001807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227406 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:22:25.227391601 +0000 UTC m=+152.662063738 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227442 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.227470 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:22:25.227459493 +0000 UTC m=+152.662129350 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.232601 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.235981 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.236026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.236041 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.236069 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.236084 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.248621 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.265338 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.283897 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.304183 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:14Z\\\",\\\"message\\\":\\\"2025-12-04T15:20:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58\\\\n2025-12-04T15:20:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58 to /host/opt/cni/bin/\\\\n2025-12-04T15:20:29Z [verbose] multus-daemon started\\\\n2025-12-04T15:20:29Z [verbose] Readiness Indicator file check\\\\n2025-12-04T15:21:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.321221 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97794e51-7c92-49d6-bea4-5824d9485fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf579ed5bf7237ca102c3239090f593aa508f224de04b9c0b080aff84cc8afe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.339416 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.339465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.339479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.339507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.339521 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.353335 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.369721 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.387017 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.387213 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.387287 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.387316 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.387374 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:21 crc kubenswrapper[4676]: E1204 15:21:21.387532 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
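
Every "Failed to update status for pod" entry above shares one root cause: the kubelet's status PATCH is intercepted by the pod.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, months before the node's clock time of 2025-12-04T15:21:21Z. The Go sketch below is a minimal, self-contained reproduction of that certificate check, not code from the cluster: it builds a throwaway self-signed certificate whose NotAfter is already in the past (an assumed stand-in for the webhook's real cert) and verifies it with crypto/x509 the way a TLS client does during the handshake, producing the same "certificate has expired or is not yet valid" error string seen in every failed webhook call.

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func must(err error) {
	if err != nil {
		panic(err)
	}
}

func main() {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	must(err)

	// Hypothetical stand-in for the webhook serving cert: its validity window
	// already ended, like the real cert (NotAfter 2025-08-24T17:21:41Z)
	// checked at 2025-12-04T15:21:21Z in the log above.
	tmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "network-node-identity.example"},
		NotBefore:             time.Now().Add(-48 * time.Hour),
		NotAfter:              time.Now().Add(-24 * time.Hour), // already expired
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	must(err)
	cert, err := x509.ParseCertificate(der)
	must(err)

	roots := x509.NewCertPool()
	roots.AddCert(cert)

	// Verify against the default CurrentTime (now), as crypto/tls does; the
	// expiry check fails with x509.CertificateInvalidError (Expired).
	_, err = cert.Verify(x509.VerifyOptions{Roots: roots})
	fmt.Println(err)
	// Output: x509: certificate has expired or is not yet valid: current time
	// <now> is after <NotAfter>
}

Because the expiry check runs before any request is processed, every status patch in this window is rejected identically, and will keep being rejected for as long as the webhook presents that certificate.
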
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.399862 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.415059 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.430945 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.442323 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.442363 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.442375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.442394 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.442404 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.447815 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.465695 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.478088 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:21Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.544962 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.545024 4676 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.545035 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.545054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.545074 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.647861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.648255 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.648339 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.648424 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.648505 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.751057 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.751481 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.751587 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.751704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.751815 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.854880 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.854934 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.854945 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.854964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.854976 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.957744 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.957825 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.957840 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.957864 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:21 crc kubenswrapper[4676]: I1204 15:21:21.957879 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:21Z","lastTransitionTime":"2025-12-04T15:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.060827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.060893 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.060932 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.060964 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.060981 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.355223 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.355291 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.355304 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.355325 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.355338 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.384056 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:22 crc kubenswrapper[4676]: E1204 15:21:22.384272 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.457886 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.457971 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.457992 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.458046 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.458060 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.560775 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.560830 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.560841 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.560861 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.560872 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.663461 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.663507 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.663521 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.663541 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.663551 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.767077 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.767135 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.767147 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.767165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.767180 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.870530 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.870870 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.871037 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.871176 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.871245 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.974273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.974326 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.974338 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.974360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:22 crc kubenswrapper[4676]: I1204 15:21:22.974387 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.076506 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.076583 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.076597 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.076618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.076631 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.179896 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.179974 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.179985 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.180005 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.180017 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.283223 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.283272 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.283284 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.283307 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.283319 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.384191 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.384249 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.384211 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:23 crc kubenswrapper[4676]: E1204 15:21:23.384774 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
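
Meanwhile the node itself keeps flapping NotReady: the five-event group (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, plus the setters.go "Node became not ready" condition) is re-recorded roughly every 100 ms because no CNI configuration file has appeared in /etc/kubernetes/cni/net.d/. The condition the kubelet logs is a plain JSON object; the sketch below (illustrative struct and field names, not kubelet source) decodes one of the logged payloads to show what each repeated entry actually carries.

package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors the fields visible in the logged condition objects;
// the type is an assumption for this sketch, not the kubelet's own struct.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied from one of the setters.go entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:22Z","lastTransitionTime":"2025-12-04T15:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	// The kubelet re-records this same condition on each sync until a CNI
	// config appears, which is why the five-event group repeats every ~100ms.
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}

In general, once the network plugin writes its configuration into /etc/kubernetes/cni/net.d/, NetworkReady flips to true and this repetition stops; until then the "No sandbox for pod can be found" and "Error syncing pod, skipping" entries for the networking pods recur for the same reason.
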
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:23 crc kubenswrapper[4676]: E1204 15:21:23.384952 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:23 crc kubenswrapper[4676]: E1204 15:21:23.385076 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.391137 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.392181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.392198 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.392640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.392707 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.411392 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.428606 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.444061 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.468093 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde2b079838176c983d693f9f4e512b3c3a3bae4
f6e9c3219506d2c3da21db40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} 
name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.485422 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.495604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.495655 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.495665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.495686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.495698 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.509403 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.528591 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.554872 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.575589 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:14Z\\\",\\\"message\\\":\\\"2025-12-04T15:20:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58\\\\n2025-12-04T15:20:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58 to /host/opt/cni/bin/\\\\n2025-12-04T15:20:29Z [verbose] multus-daemon started\\\\n2025-12-04T15:20:29Z [verbose] Readiness Indicator file check\\\\n2025-12-04T15:21:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.589923 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97794e51-7c92-49d6-bea4-5824d9485fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf579ed5bf7237ca102c3239090f593aa508f224de04b9c0b080aff84cc8afe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.599590 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.599661 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.599676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.599694 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.599725 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.609505 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.625281 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.639127 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.657368 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.676056 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.690633 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.702862 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.702952 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.702975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.703000 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.703015 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.705570 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.718303 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:23Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:23 crc 
kubenswrapper[4676]: I1204 15:21:23.805716 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.805779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.805790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.805820 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.805832 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.909276 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.909336 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.909349 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.909365 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:23 crc kubenswrapper[4676]: I1204 15:21:23.909375 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:23Z","lastTransitionTime":"2025-12-04T15:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.011879 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.011969 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.011983 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.012001 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.012012 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.115120 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.115170 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.115183 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.115204 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.115215 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.161371 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/3.log" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.162187 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/2.log" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.166445 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" exitCode=1 Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.166535 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.167555 4676 scope.go:117] "RemoveContainer" containerID="ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.168944 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:21:24 crc kubenswrapper[4676]: E1204 15:21:24.169239 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.186050 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8eb55fe-960c-4215-a2d1-1a017e17b80b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://24fd95444022437bcb984a0b2128242f465430fc451ad6c2c4ae96b6ba8cf1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6645c57e6c5d7605eb80db785ebfec291c48d4410e0300364d944852778f574\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d35e166c5205495aa94070866a6629cdaa21369e958094c2d6feced3293d9b4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.197387 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dgffs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eba809fc-7400-4863-8e96-baae38c42001\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e501ce091858dbd24df9049a79c92c1941b567a5d7033a7068b84ca999a424e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rmr2q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:23Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dgffs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.208403 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97794e51-7c92-49d6-bea4-5824d9485fac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cf579ed5bf7237ca102c3239090f593aa508f224de04b9c0b080aff84cc8afe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db5b8c9147a68093513edec9e2f5eb9b1f64bafc5aff9b5e907090b7f5292b71\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"
2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.218480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.218539 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.218553 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.218576 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.218591 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.223408 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-04T15:20:16Z\\\",\\\"message\\\":\\\"le observer\\\\nW1204 15:20:16.358381 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1204 15:20:16.358946 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1204 15:20:16.361027 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2656649520/tls.crt::/tmp/serving-cert-2656649520/tls.key\\\\\\\"\\\\nI1204 15:20:16.897673 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1204 15:20:16.902237 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1204 15:20:16.902267 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1204 15:20:16.902338 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1204 15:20:16.902346 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1204 15:20:16.912371 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1204 15:20:16.912420 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912426 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1204 15:20:16.912431 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1204 15:20:16.912434 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1204 15:20:16.912437 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1204 15:20:16.912440 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1204 15:20:16.912997 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1204 15:20:16.915040 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:57Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.241852 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56cfbac5f60f0eecfcf0186b64397baf82e952a9f0124404fc9c0ce9f73d12b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d75ecf7f1b78befece8b3544dbbc0839f6195be070c0ef0bdef5277d15673365\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.254017 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2362781-61ed-4bed-b752-d89d5808d9fe\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2fdc8eaa1bb45491f38724762c06529b1a9b73f1f400ef0a9ca3ba3830895fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56898c9ca8502c9a61b6ceb8159fe412f3b1117df4711ba075c7112338917883\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jgtzt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-wldgd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 
15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.265472 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"711742b9-8c03-4234-ae1d-4d7d3baa4217\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6xw6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-nsvkq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.278538 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f32f0d3-f65b-4255-809b-351615963135\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a6708565d39d783799eb319ad5fc7f4121504bb25807d55c43c54cb1468d447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b4ee7bb82f3e96054fe054d24d744bd3d919c595368e20e7b2a4bcde0ba02f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3661fc3c4f1f996c4f0ac868ea52b468a8a6c641581b49483c5dc941986ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:19:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://faa5f945b58ecc39ce30f107d2964182c20b7df92d3fda034f96861aa55946f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:19:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:19:54Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:19:53Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.290641 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.303639 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.314674 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b3eca9b5-0269-40ad-8bc1-142e702d9454\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10e26cafad549a293a26645012bcc62b23c28046921e0e6d2d0fb663b4d13360\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lh2vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-5s6p9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.321165 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.321214 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.321226 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.321243 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.321255 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.335321 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bcf3dbe3018cea30edd0f670d022260398517e53e48fd9b553511bae7baa5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:20:56Z\\\",\\\"message\\\":\\\" to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:20:55Z is after 2025-08-24T17:21:41Z]\\\\nI1204 15:20:55.956191 6346 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-dgffs openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd openshift-kube-controller-manager/kube-controller-manager-crc openshift-multus/network-metrics-daemon-nsvkq openshift-network-diagnostics/network-check-target-xd92c openshift-multus/multus-additional-cni-plugins-f8vjl openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-node-wmbt2 openshift-network-operator/iptables-alerter-4ln5h openshift-dns/node-resolver-9bc4z openshift-kube-apiserver/kube-apiserver-crc openshift-multus/multus-wch9m]\\\\nI1204 15:20:55.956187 6346 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress/router-internal-default]} 
name:Service_openshift-ingress/router-i\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:23Z\\\",\\\"message\\\":\\\"s.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:21:22.558429 6649 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1204 15:21:22.558573 6649 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1204 15:21:22.558631 6649 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1204 15:21:22.558880 6649 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:21:22.559008 6649 handler.go:208] Removed *v1.Node event handler 2\\\\nI1204 15:21:22.558876 6649 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1204 15:21:22.559208 6649 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1204 15:21:22.559374 6649 handler.go:208] Removed *v1.Node event handler 7\\\\nI1204 15:21:22.559749 6649 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1204 15:21:22.559787 6649 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1204 15:21:22.559809 6649 factory.go:656] Stopping watch factory\\\\nI1204 15:21:22.559824 6649 ovnkube.go:599] Stopped ovnkube\\\\nI1204 15:21:22.559846 6649 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1204 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:21:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6j6vk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-wmbt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.351616 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22cdb74a2029dd01aaf70248ff26455924b4545579acba95057317811b0dca33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.362434 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-9bc4z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0eaaf25e-b575-426f-9967-d81ac3c882ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26b6937f7fb7ca7d683e86e8d081d7d2f5cd881b7071a9c6f4ef9748ae40bfa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7x6p4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:19Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-9bc4z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.378558 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3f9795f2-fd74-48a2-af9c-90e7d47ab178\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a43f38a1eb2c9b0f4d8f2ee3b03c880766ded7b40402dfea9097fc7ecd2853d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://975f241e74043c648f821c3cc383688426464957653afa792d00fdc93fc25f69\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://25055ecb9480bc3b49e8811017803b4cfa42e1d8b4b82dfeb979b7a9c5c3bb01\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb923fee45db7d292e367d8d622370d12a765713106bd7e44f99d3f24b648b4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://546f9dc56a93e450a6bec3c04a884d3c5ddc45b5e63c091e2670dacd81830bb0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5078e1910ca967bfac8de1d46b7eb42434ca5e27b5795575e13668dedc3b415\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://99ffc9b380df75275d453a7e8ca75658ccd4cd4f85f9f8c2abb1f72954627680\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:20:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-frh24\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-f8vjl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.383524 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:24 crc kubenswrapper[4676]: E1204 15:21:24.383686 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.392660 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-wch9m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a201486-d4f3-4677-adad-4028d94e0623\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-04T15:21:14Z\\\",\\\"message\\\":\\\"2025-12-04T15:20:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58\\\\n2025-12-04T15:20:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2071f486-2e30-4ac1-a311-7c96d3bd4c58 to /host/opt/cni/bin/\\\\n2025-12-04T15:20:29Z [verbose] multus-daemon started\\\\n2025-12-04T15:20:29Z [verbose] Readiness Indicator file check\\\\n2025-12-04T15:21:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-04T15:20:20Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:21:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxq8f\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-04T15:20:20Z\\\"}}\" for pod \"openshift-multus\"/\"multus-wch9m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.406335 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.419634 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:20:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6fe6cbef733fd5ff1d7ba7bcafb48c40ca338a14cfd4d587aa24a96642e62bbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-04T15:20:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:24Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.424609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.424645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.424654 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.424671 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.424683 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.527095 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.527155 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.527169 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.527201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.527215 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.629515 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.629569 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.629580 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.629609 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.629627 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.732963 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.733042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.733053 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.733072 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.733084 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.836853 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.836995 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.837038 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.837062 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.837076 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.939449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.939585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.939598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.939619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:24 crc kubenswrapper[4676]: I1204 15:21:24.939630 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:24Z","lastTransitionTime":"2025-12-04T15:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.042590 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.042645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.042656 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.042676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.042692 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.145184 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.145227 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.145237 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.145252 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.145262 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.173933 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/3.log" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.248385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.248450 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.248460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.248480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.248491 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.351335 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.351480 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.351498 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.351554 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.351572 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.384452 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.384520 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.384785 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.384817 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.384981 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.385084 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.455305 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.455382 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.455397 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.455421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.455459 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.558505 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.558604 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.558620 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.558640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.558653 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.590826 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.590896 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.590941 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.590963 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.590977 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.606167 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.611354 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.611406 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.611418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.611443 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.611463 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.625566 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.635002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.635070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.635083 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.635109 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.635123 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.649452 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.654634 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.654712 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.654730 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.654755 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.654777 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.669625 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.674079 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.674127 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.674142 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.674205 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.674221 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.688132 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-04T15:21:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4574455b-7b00-4e77-9815-81145b03a6ca\\\",\\\"systemUUID\\\":\\\"7171a43d-58aa-4be8-82e2-5e1d4cb4902b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-04T15:21:25Z is after 2025-08-24T17:21:41Z" Dec 04 15:21:25 crc kubenswrapper[4676]: E1204 15:21:25.688266 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.690545 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.690586 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.690598 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.690619 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.690636 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.793872 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.793967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.793982 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.794009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.794024 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.896396 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.896465 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.896477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.896500 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.896513 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.999715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.999763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.999773 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:25 crc kubenswrapper[4676]: I1204 15:21:25.999790 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:25.999802 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:25Z","lastTransitionTime":"2025-12-04T15:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.105090 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.105156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.105168 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.105196 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.105207 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.208179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.208236 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.208246 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.208267 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.208279 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.311897 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.312012 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.312027 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.312050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.312088 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.383818 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:26 crc kubenswrapper[4676]: E1204 15:21:26.384052 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.415343 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.415415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.415431 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.415457 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.415467 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.519686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.519759 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.519772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.519789 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.519799 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.622159 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.622221 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.622232 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.622253 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.622268 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.725704 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.725760 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.725771 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.725791 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.725804 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.829073 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.829124 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.829138 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.829158 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.829170 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.932645 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.932698 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.932710 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.932734 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:26 crc kubenswrapper[4676]: I1204 15:21:26.932747 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:26Z","lastTransitionTime":"2025-12-04T15:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.036327 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.036380 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.036393 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.036415 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.036427 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.139835 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.139895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.139907 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.139942 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.139953 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.243360 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.243417 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.243432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.243449 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.243460 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.346112 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.346181 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.346198 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.346218 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.346231 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.384254 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.384351 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.384313 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:27 crc kubenswrapper[4676]: E1204 15:21:27.384471 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:27 crc kubenswrapper[4676]: E1204 15:21:27.384524 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:27 crc kubenswrapper[4676]: E1204 15:21:27.384712 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.449427 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.449490 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.449509 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.449531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.449544 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.552395 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.552472 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.552495 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.552523 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.552539 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.656490 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.656560 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.656573 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.656596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.656615 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.759635 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.759676 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.759686 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.759702 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.759712 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.862564 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.862629 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.862640 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.862665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.862678 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.965571 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.965806 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.965823 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.965843 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:27 crc kubenswrapper[4676]: I1204 15:21:27.965887 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:27Z","lastTransitionTime":"2025-12-04T15:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.068978 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.069026 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.069042 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.069065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.069086 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.172002 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.172061 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.172074 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.172103 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.172116 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.275052 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.275179 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.275194 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.275210 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.275221 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.379273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.379316 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.379326 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.379346 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.379358 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.383901 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:28 crc kubenswrapper[4676]: E1204 15:21:28.384185 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.481667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.481749 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.481763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.481786 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.481799 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.585192 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.585250 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.585260 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.585280 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.585292 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.688453 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.688500 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.688512 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.688531 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.688542 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.791684 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.791723 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.791735 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.791751 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.791761 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.894141 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.894203 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.894222 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.894248 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.894265 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.997667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.997723 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.997737 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.997754 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:28 crc kubenswrapper[4676]: I1204 15:21:28.997765 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:28Z","lastTransitionTime":"2025-12-04T15:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.101420 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.101477 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.101489 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.101509 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.101520 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.204309 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.204377 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.204388 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.204409 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.204420 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.306969 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.307031 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.307044 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.307065 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.307077 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.383457 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.383521 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.383540 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:29 crc kubenswrapper[4676]: E1204 15:21:29.383666 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:29 crc kubenswrapper[4676]: E1204 15:21:29.383829 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:29 crc kubenswrapper[4676]: E1204 15:21:29.383966 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.409855 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.409899 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.409925 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.409940 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.409950 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.512990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.513075 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.513093 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.513182 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.513202 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.616273 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.616333 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.616367 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.616396 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.616414 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.720543 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.720638 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.720649 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.720667 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.720678 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.823892 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.824277 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.824290 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.824310 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.824321 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.927201 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.927251 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.927262 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.927280 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:29 crc kubenswrapper[4676]: I1204 15:21:29.927294 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:29Z","lastTransitionTime":"2025-12-04T15:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.030421 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.030466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.030479 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.030497 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.030510 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.133986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.134054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.134071 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.134092 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.134109 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.236880 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.236953 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.236967 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.236990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.237003 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.339665 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.339722 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.339733 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.339753 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.339766 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.383276 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:30 crc kubenswrapper[4676]: E1204 15:21:30.383473 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.405861 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.442986 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.443037 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.443047 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.443066 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.443075 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.546525 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.546585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.546599 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.546621 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.546634 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.649647 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.649688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.649702 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.649720 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.649730 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.752707 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.752746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.752756 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.752772 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.752782 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.856648 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.856706 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.856728 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.856757 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.856779 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.963762 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.964007 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.964332 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.964361 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:30 crc kubenswrapper[4676]: I1204 15:21:30.964376 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:30Z","lastTransitionTime":"2025-12-04T15:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.067662 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.067715 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.067726 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.067746 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.067759 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.170618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.170719 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.170732 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.170748 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.170760 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.274618 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.274653 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.274664 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.274688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.274706 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.377960 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.378029 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.378050 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.378084 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.378098 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.384429 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.384429 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.384435 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:31 crc kubenswrapper[4676]: E1204 15:21:31.384749 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:31 crc kubenswrapper[4676]: E1204 15:21:31.384859 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:31 crc kubenswrapper[4676]: E1204 15:21:31.385116 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.481827 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.481885 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.481900 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.481943 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.481955 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.585418 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.585466 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.585475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.585493 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.585505 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.687845 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.687886 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.687895 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.687927 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.687938 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.790954 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.791003 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.791015 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.791035 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.791048 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.894432 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.894475 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.894484 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.894502 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.894513 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.996929 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.996990 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.997010 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.997030 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:31 crc kubenswrapper[4676]: I1204 15:21:31.997042 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:31Z","lastTransitionTime":"2025-12-04T15:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.101009 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.101101 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.101117 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.101144 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.101158 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:32Z","lastTransitionTime":"2025-12-04T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.203958 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.204024 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.204043 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.204070 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.204145 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:32Z","lastTransitionTime":"2025-12-04T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.306795 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.306848 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.306858 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.306874 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.306887 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:32Z","lastTransitionTime":"2025-12-04T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.383682 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq"
Dec 04 15:21:32 crc kubenswrapper[4676]: E1204 15:21:32.384363 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.409673 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.410085 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.410156 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.410241 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:32 crc kubenswrapper[4676]: I1204 15:21:32.410303 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:32Z","lastTransitionTime":"2025-12-04T15:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
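Every NotReady heartbeat above traces back to the same check: the container runtime reports NetworkReady=false because nothing in /etc/kubernetes/cni/net.d/ parses as a CNI network config. Below is a minimal Go sketch, not kubelet source, of that directory scan; the path comes from the error message itself, and the .conf/.conflist/.json extension filter is an assumption based on libcni's documented loading rules.

```go
// cnicheck.go - illustrative stand-in for the CNI config lookup behind the
// "no CNI configuration file" error; run on the node to see what the
// runtime would find.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet error
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v (node would stay NotReady)\n", dir, err)
		return
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni is documented to load
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("CNI configs present:", found)
}
```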
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.384326 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.384429 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 15:21:33 crc kubenswrapper[4676]: E1204 15:21:33.384524 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 15:21:33 crc kubenswrapper[4676]: E1204 15:21:33.384639 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.384429 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 15:21:33 crc kubenswrapper[4676]: E1204 15:21:33.384739 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.430183 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=3.430120755 podStartE2EDuration="3.430120755s" podCreationTimestamp="2025-12-04 15:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.429794525 +0000 UTC m=+100.864464402" watchObservedRunningTime="2025-12-04 15:21:33.430120755 +0000 UTC m=+100.864790612"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.444529 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.444585 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.444596 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.444614 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.444625 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:33Z","lastTransitionTime":"2025-12-04T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.513218 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podStartSLOduration=75.51319236 podStartE2EDuration="1m15.51319236s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.480754327 +0000 UTC m=+100.915424214" watchObservedRunningTime="2025-12-04 15:21:33.51319236 +0000 UTC m=+100.947862227"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.546344 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.546688 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.546779 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.546872 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.547163 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:33Z","lastTransitionTime":"2025-12-04T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.602724 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9bc4z" podStartSLOduration=75.602696143 podStartE2EDuration="1m15.602696143s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.569035594 +0000 UTC m=+101.003705471" watchObservedRunningTime="2025-12-04 15:21:33.602696143 +0000 UTC m=+101.037366000"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.602887 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-f8vjl" podStartSLOduration=75.602881259 podStartE2EDuration="1m15.602881259s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.590965702 +0000 UTC m=+101.025635559" watchObservedRunningTime="2025-12-04 15:21:33.602881259 +0000 UTC m=+101.037551146"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.616740 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wch9m" podStartSLOduration=75.616720041 podStartE2EDuration="1m15.616720041s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.616408212 +0000 UTC m=+101.051078069" watchObservedRunningTime="2025-12-04 15:21:33.616720041 +0000 UTC m=+101.051389898"
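The m=+101.05... suffixes on observedRunningTime and watchObservedRunningTime are Go's monotonic clock reading, seconds since the process read its first clock value (about 101s after this kubelet started). A tiny stdlib demonstration of where that suffix comes from:

```go
// monotonic.go - Go time.Time values carry a monotonic reading that prints
// as an m=+... suffix, the same notation seen in the records above.
package main

import (
	"fmt"
	"time"
)

func main() {
	start := time.Now()
	time.Sleep(50 * time.Millisecond)
	now := time.Now()
	fmt.Println(now)            // ends with an m=+... monotonic suffix
	fmt.Println(now.Sub(start)) // subtraction uses the monotonic reading
}
```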
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.645840 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=77.645811227 podStartE2EDuration="1m17.645811227s" podCreationTimestamp="2025-12-04 15:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.644264512 +0000 UTC m=+101.078934389" watchObservedRunningTime="2025-12-04 15:21:33.645811227 +0000 UTC m=+101.080481084"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.646105 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.646099145 podStartE2EDuration="17.646099145s" podCreationTimestamp="2025-12-04 15:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.628515104 +0000 UTC m=+101.063184961" watchObservedRunningTime="2025-12-04 15:21:33.646099145 +0000 UTC m=+101.080769002"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.649611 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.649824 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.649898 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.649997 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.650069 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:33Z","lastTransitionTime":"2025-12-04T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.663844 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.663826111 podStartE2EDuration="1m10.663826111s" podCreationTimestamp="2025-12-04 15:20:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.663508271 +0000 UTC m=+101.098178128" watchObservedRunningTime="2025-12-04 15:21:33.663826111 +0000 UTC m=+101.098495958"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.679888 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dgffs" podStartSLOduration=75.679861727 podStartE2EDuration="1m15.679861727s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.6785803 +0000 UTC m=+101.113250157" watchObservedRunningTime="2025-12-04 15:21:33.679861727 +0000 UTC m=+101.114531594"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.693244 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=51.693212765 podStartE2EDuration="51.693212765s" podCreationTimestamp="2025-12-04 15:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.69269763 +0000 UTC m=+101.127367507" watchObservedRunningTime="2025-12-04 15:21:33.693212765 +0000 UTC m=+101.127882642"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.743206 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-wldgd" podStartSLOduration=74.743177798 podStartE2EDuration="1m14.743177798s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:33.740420628 +0000 UTC m=+101.175090485" watchObservedRunningTime="2025-12-04 15:21:33.743177798 +0000 UTC m=+101.177847665"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.756690 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.756752 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.756763 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.756782 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:33 crc kubenswrapper[4676]: I1204 15:21:33.756793 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:33Z","lastTransitionTime":"2025-12-04T15:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
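podStartE2EDuration and podStartSLOduration in these records are the same quantity in two notations: a Go duration string and its value in seconds. A one-line check against the kube-controller-manager-crc figures above:

```go
// durations.go - shows that "1m10.663826111s" and 70.663826111 agree.
package main

import (
	"fmt"
	"time"
)

func main() {
	d, err := time.ParseDuration("1m10.663826111s") // podStartE2EDuration from the log
	if err != nil {
		panic(err)
	}
	fmt.Println(d.Seconds()) // 70.663826111, the logged podStartSLOduration
}
```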
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.383732 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq"
Dec 04 15:21:34 crc kubenswrapper[4676]: E1204 15:21:34.383863 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.479374 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.479428 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.479441 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.479460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.479471 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:34Z","lastTransitionTime":"2025-12-04T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.582385 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.582446 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.582460 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.582481 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:34 crc kubenswrapper[4676]: I1204 15:21:34.582494 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:34Z","lastTransitionTime":"2025-12-04T15:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.304391 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.304463 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.304482 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.304503 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.304515 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:35Z","lastTransitionTime":"2025-12-04T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.384313 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 04 15:21:35 crc kubenswrapper[4676]: E1204 15:21:35.384483 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.384693 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 04 15:21:35 crc kubenswrapper[4676]: E1204 15:21:35.384749 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.385235 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 04 15:21:35 crc kubenswrapper[4676]: E1204 15:21:35.385424 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.385624 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"
Dec 04 15:21:35 crc kubenswrapper[4676]: E1204 15:21:35.385792 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.407318 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.407375 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.407387 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.407411 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.407429 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:35Z","lastTransitionTime":"2025-12-04T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.509975 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.510016 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.510033 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.510054 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.510070 4676 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-04T15:21:35Z","lastTransitionTime":"2025-12-04T15:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.774188 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"]
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.774814 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.777281 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.777410 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.777498 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.778996 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.788978 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc3c245-2138-45d8-9660-1ff38907a4d9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.789037 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.789063 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.789095 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc3c245-2138-45d8-9660-1ff38907a4d9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.789110 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acc3c245-2138-45d8-9660-1ff38907a4d9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7"
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc3c245-2138-45d8-9660-1ff38907a4d9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890083 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890110 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890135 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/acc3c245-2138-45d8-9660-1ff38907a4d9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890151 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acc3c245-2138-45d8-9660-1ff38907a4d9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.890293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/acc3c245-2138-45d8-9660-1ff38907a4d9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.891115 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/acc3c245-2138-45d8-9660-1ff38907a4d9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.897429 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/acc3c245-2138-45d8-9660-1ff38907a4d9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:35 crc kubenswrapper[4676]: I1204 15:21:35.909014 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/acc3c245-2138-45d8-9660-1ff38907a4d9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-tmtz7\" (UID: \"acc3c245-2138-45d8-9660-1ff38907a4d9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:36 crc kubenswrapper[4676]: I1204 15:21:36.093059 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" Dec 04 15:21:36 crc kubenswrapper[4676]: I1204 15:21:36.215469 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" event={"ID":"acc3c245-2138-45d8-9660-1ff38907a4d9","Type":"ContainerStarted","Data":"446559989eb577223c2d62b52764e165102d57807f2d2019da4cf7ef406cfab1"} Dec 04 15:21:36 crc kubenswrapper[4676]: I1204 15:21:36.384192 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:36 crc kubenswrapper[4676]: E1204 15:21:36.384610 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:37 crc kubenswrapper[4676]: I1204 15:21:37.220110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" event={"ID":"acc3c245-2138-45d8-9660-1ff38907a4d9","Type":"ContainerStarted","Data":"61bdaf4573ca33f2c7664e0af83e97d9e5e1fba45affadacc1a14f85b1ec54e0"} Dec 04 15:21:37 crc kubenswrapper[4676]: I1204 15:21:37.383813 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:37 crc kubenswrapper[4676]: I1204 15:21:37.383873 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:37 crc kubenswrapper[4676]: I1204 15:21:37.383935 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:37 crc kubenswrapper[4676]: E1204 15:21:37.384020 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:37 crc kubenswrapper[4676]: E1204 15:21:37.384129 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:37 crc kubenswrapper[4676]: E1204 15:21:37.384224 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:38 crc kubenswrapper[4676]: I1204 15:21:38.383470 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:38 crc kubenswrapper[4676]: E1204 15:21:38.383670 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:39 crc kubenswrapper[4676]: I1204 15:21:39.384219 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:39 crc kubenswrapper[4676]: I1204 15:21:39.384267 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:39 crc kubenswrapper[4676]: E1204 15:21:39.384419 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:39 crc kubenswrapper[4676]: I1204 15:21:39.384230 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:39 crc kubenswrapper[4676]: E1204 15:21:39.384516 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:39 crc kubenswrapper[4676]: E1204 15:21:39.384677 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:40 crc kubenswrapper[4676]: I1204 15:21:40.383851 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:40 crc kubenswrapper[4676]: E1204 15:21:40.384063 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:40 crc kubenswrapper[4676]: I1204 15:21:40.443847 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:40 crc kubenswrapper[4676]: E1204 15:21:40.444083 4676 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:21:40 crc kubenswrapper[4676]: E1204 15:21:40.444152 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs podName:711742b9-8c03-4234-ae1d-4d7d3baa4217 nodeName:}" failed. No retries permitted until 2025-12-04 15:22:44.444135235 +0000 UTC m=+171.878805092 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs") pod "network-metrics-daemon-nsvkq" (UID: "711742b9-8c03-4234-ae1d-4d7d3baa4217") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 04 15:21:41 crc kubenswrapper[4676]: I1204 15:21:41.384489 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:41 crc kubenswrapper[4676]: E1204 15:21:41.384674 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:41 crc kubenswrapper[4676]: I1204 15:21:41.384522 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:41 crc kubenswrapper[4676]: E1204 15:21:41.384760 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:41 crc kubenswrapper[4676]: I1204 15:21:41.384498 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:41 crc kubenswrapper[4676]: E1204 15:21:41.384821 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:42 crc kubenswrapper[4676]: I1204 15:21:42.384140 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:42 crc kubenswrapper[4676]: E1204 15:21:42.384314 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:43 crc kubenswrapper[4676]: I1204 15:21:43.384317 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:43 crc kubenswrapper[4676]: I1204 15:21:43.384453 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:43 crc kubenswrapper[4676]: E1204 15:21:43.385553 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:43 crc kubenswrapper[4676]: I1204 15:21:43.385791 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:43 crc kubenswrapper[4676]: E1204 15:21:43.385854 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:43 crc kubenswrapper[4676]: E1204 15:21:43.385988 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:44 crc kubenswrapper[4676]: I1204 15:21:44.383415 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:44 crc kubenswrapper[4676]: E1204 15:21:44.383603 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:45 crc kubenswrapper[4676]: I1204 15:21:45.383954 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:45 crc kubenswrapper[4676]: I1204 15:21:45.383953 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:45 crc kubenswrapper[4676]: E1204 15:21:45.385042 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:45 crc kubenswrapper[4676]: I1204 15:21:45.383975 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:45 crc kubenswrapper[4676]: E1204 15:21:45.385174 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:45 crc kubenswrapper[4676]: E1204 15:21:45.384962 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:46 crc kubenswrapper[4676]: I1204 15:21:46.383394 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:46 crc kubenswrapper[4676]: E1204 15:21:46.383593 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:47 crc kubenswrapper[4676]: I1204 15:21:47.383524 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:47 crc kubenswrapper[4676]: I1204 15:21:47.383550 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:47 crc kubenswrapper[4676]: I1204 15:21:47.383752 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:47 crc kubenswrapper[4676]: E1204 15:21:47.384081 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:47 crc kubenswrapper[4676]: E1204 15:21:47.383741 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:47 crc kubenswrapper[4676]: E1204 15:21:47.383942 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:48 crc kubenswrapper[4676]: I1204 15:21:48.383799 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:48 crc kubenswrapper[4676]: E1204 15:21:48.384047 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:49 crc kubenswrapper[4676]: I1204 15:21:49.384369 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:49 crc kubenswrapper[4676]: I1204 15:21:49.384515 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:49 crc kubenswrapper[4676]: I1204 15:21:49.384592 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:49 crc kubenswrapper[4676]: E1204 15:21:49.384634 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:49 crc kubenswrapper[4676]: E1204 15:21:49.384725 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:49 crc kubenswrapper[4676]: E1204 15:21:49.385129 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:49 crc kubenswrapper[4676]: I1204 15:21:49.385528 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:21:49 crc kubenswrapper[4676]: E1204 15:21:49.385693 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" Dec 04 15:21:50 crc kubenswrapper[4676]: I1204 15:21:50.383768 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:50 crc kubenswrapper[4676]: E1204 15:21:50.383962 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:51 crc kubenswrapper[4676]: I1204 15:21:51.384030 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:51 crc kubenswrapper[4676]: I1204 15:21:51.384193 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:51 crc kubenswrapper[4676]: E1204 15:21:51.385181 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:51 crc kubenswrapper[4676]: E1204 15:21:51.384248 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:51 crc kubenswrapper[4676]: I1204 15:21:51.384292 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:51 crc kubenswrapper[4676]: E1204 15:21:51.385272 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:52 crc kubenswrapper[4676]: I1204 15:21:52.383253 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:52 crc kubenswrapper[4676]: E1204 15:21:52.383458 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:53 crc kubenswrapper[4676]: E1204 15:21:53.256655 4676 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 04 15:21:53 crc kubenswrapper[4676]: I1204 15:21:53.384370 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:53 crc kubenswrapper[4676]: I1204 15:21:53.384391 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:53 crc kubenswrapper[4676]: I1204 15:21:53.385685 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:53 crc kubenswrapper[4676]: E1204 15:21:53.385674 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:53 crc kubenswrapper[4676]: E1204 15:21:53.385792 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:53 crc kubenswrapper[4676]: E1204 15:21:53.385920 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:53 crc kubenswrapper[4676]: E1204 15:21:53.883988 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:21:54 crc kubenswrapper[4676]: I1204 15:21:54.384184 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:54 crc kubenswrapper[4676]: E1204 15:21:54.384418 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:55 crc kubenswrapper[4676]: I1204 15:21:55.383428 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:55 crc kubenswrapper[4676]: I1204 15:21:55.383543 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:55 crc kubenswrapper[4676]: E1204 15:21:55.383590 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:55 crc kubenswrapper[4676]: I1204 15:21:55.383695 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:55 crc kubenswrapper[4676]: E1204 15:21:55.383734 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:55 crc kubenswrapper[4676]: E1204 15:21:55.383802 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:56 crc kubenswrapper[4676]: I1204 15:21:56.383585 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:56 crc kubenswrapper[4676]: E1204 15:21:56.383767 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:57 crc kubenswrapper[4676]: I1204 15:21:57.383871 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:57 crc kubenswrapper[4676]: I1204 15:21:57.383990 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:57 crc kubenswrapper[4676]: I1204 15:21:57.384007 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:57 crc kubenswrapper[4676]: E1204 15:21:57.384099 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:21:57 crc kubenswrapper[4676]: E1204 15:21:57.384210 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:57 crc kubenswrapper[4676]: E1204 15:21:57.384304 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:58 crc kubenswrapper[4676]: I1204 15:21:58.383761 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:21:58 crc kubenswrapper[4676]: E1204 15:21:58.383961 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:21:58 crc kubenswrapper[4676]: E1204 15:21:58.885361 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:21:59 crc kubenswrapper[4676]: I1204 15:21:59.383711 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:21:59 crc kubenswrapper[4676]: E1204 15:21:59.383921 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:21:59 crc kubenswrapper[4676]: I1204 15:21:59.384184 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:21:59 crc kubenswrapper[4676]: E1204 15:21:59.384245 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:21:59 crc kubenswrapper[4676]: I1204 15:21:59.384490 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:21:59 crc kubenswrapper[4676]: E1204 15:21:59.384562 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:00 crc kubenswrapper[4676]: I1204 15:22:00.383236 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:00 crc kubenswrapper[4676]: E1204 15:22:00.383408 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:01 crc kubenswrapper[4676]: I1204 15:22:01.383732 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:01 crc kubenswrapper[4676]: I1204 15:22:01.383732 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:01 crc kubenswrapper[4676]: E1204 15:22:01.384500 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:01 crc kubenswrapper[4676]: I1204 15:22:01.383850 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:01 crc kubenswrapper[4676]: E1204 15:22:01.384615 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:01 crc kubenswrapper[4676]: E1204 15:22:01.384725 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.308744 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/1.log" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.309333 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/0.log" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.309426 4676 generic.go:334] "Generic (PLEG): container finished" podID="2a201486-d4f3-4677-adad-4028d94e0623" containerID="ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5" exitCode=1 Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.309491 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerDied","Data":"ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5"} Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.309592 4676 scope.go:117] "RemoveContainer" containerID="67c0764eb77b5e07f89a27a36277c2a3401db234b59452ac72888e36a5b7cc45" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.310291 4676 scope.go:117] "RemoveContainer" containerID="ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5" Dec 04 15:22:02 crc kubenswrapper[4676]: E1204 15:22:02.310715 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-wch9m_openshift-multus(2a201486-d4f3-4677-adad-4028d94e0623)\"" pod="openshift-multus/multus-wch9m" podUID="2a201486-d4f3-4677-adad-4028d94e0623" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.331506 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-tmtz7" podStartSLOduration=104.331483957 podStartE2EDuration="1m44.331483957s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:21:37.238976943 +0000 UTC m=+104.673646800" watchObservedRunningTime="2025-12-04 15:22:02.331483957 +0000 UTC m=+129.766153814" Dec 04 15:22:02 crc kubenswrapper[4676]: I1204 15:22:02.383323 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:02 crc kubenswrapper[4676]: E1204 15:22:02.383536 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:03 crc kubenswrapper[4676]: I1204 15:22:03.316383 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/1.log" Dec 04 15:22:03 crc kubenswrapper[4676]: I1204 15:22:03.383572 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:03 crc kubenswrapper[4676]: I1204 15:22:03.383708 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:03 crc kubenswrapper[4676]: I1204 15:22:03.385828 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:03 crc kubenswrapper[4676]: E1204 15:22:03.386028 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:03 crc kubenswrapper[4676]: E1204 15:22:03.386082 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:03 crc kubenswrapper[4676]: E1204 15:22:03.386267 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:03 crc kubenswrapper[4676]: I1204 15:22:03.386511 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:22:03 crc kubenswrapper[4676]: E1204 15:22:03.386862 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-wmbt2_openshift-ovn-kubernetes(f1ad0d70-0230-4055-a56e-d83c06c6e0b3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" Dec 04 15:22:03 crc kubenswrapper[4676]: E1204 15:22:03.887179 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:22:04 crc kubenswrapper[4676]: I1204 15:22:04.383742 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:04 crc kubenswrapper[4676]: E1204 15:22:04.383938 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:05 crc kubenswrapper[4676]: I1204 15:22:05.384250 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:05 crc kubenswrapper[4676]: I1204 15:22:05.384363 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:05 crc kubenswrapper[4676]: I1204 15:22:05.385118 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:05 crc kubenswrapper[4676]: E1204 15:22:05.385402 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:05 crc kubenswrapper[4676]: E1204 15:22:05.385745 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:05 crc kubenswrapper[4676]: E1204 15:22:05.386263 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:06 crc kubenswrapper[4676]: I1204 15:22:06.383651 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:06 crc kubenswrapper[4676]: E1204 15:22:06.384267 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:07 crc kubenswrapper[4676]: I1204 15:22:07.383965 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:07 crc kubenswrapper[4676]: I1204 15:22:07.383963 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:07 crc kubenswrapper[4676]: I1204 15:22:07.383993 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:07 crc kubenswrapper[4676]: E1204 15:22:07.384187 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:07 crc kubenswrapper[4676]: E1204 15:22:07.384273 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:07 crc kubenswrapper[4676]: E1204 15:22:07.384328 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:08 crc kubenswrapper[4676]: I1204 15:22:08.383615 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:08 crc kubenswrapper[4676]: E1204 15:22:08.384167 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:08 crc kubenswrapper[4676]: E1204 15:22:08.888872 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:22:09 crc kubenswrapper[4676]: I1204 15:22:09.384296 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:09 crc kubenswrapper[4676]: I1204 15:22:09.384436 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:09 crc kubenswrapper[4676]: I1204 15:22:09.384460 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:09 crc kubenswrapper[4676]: E1204 15:22:09.384625 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:09 crc kubenswrapper[4676]: E1204 15:22:09.384889 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:09 crc kubenswrapper[4676]: E1204 15:22:09.384894 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:10 crc kubenswrapper[4676]: I1204 15:22:10.383611 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:10 crc kubenswrapper[4676]: E1204 15:22:10.383806 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:11 crc kubenswrapper[4676]: I1204 15:22:11.383517 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:11 crc kubenswrapper[4676]: I1204 15:22:11.383608 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:11 crc kubenswrapper[4676]: E1204 15:22:11.383677 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:11 crc kubenswrapper[4676]: I1204 15:22:11.383608 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:11 crc kubenswrapper[4676]: E1204 15:22:11.383787 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:11 crc kubenswrapper[4676]: E1204 15:22:11.383850 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:12 crc kubenswrapper[4676]: I1204 15:22:12.383551 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:12 crc kubenswrapper[4676]: E1204 15:22:12.383731 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:13 crc kubenswrapper[4676]: I1204 15:22:13.383386 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:13 crc kubenswrapper[4676]: I1204 15:22:13.383432 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:13 crc kubenswrapper[4676]: I1204 15:22:13.383397 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:13 crc kubenswrapper[4676]: E1204 15:22:13.385025 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:13 crc kubenswrapper[4676]: E1204 15:22:13.385233 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:13 crc kubenswrapper[4676]: E1204 15:22:13.385401 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:13 crc kubenswrapper[4676]: E1204 15:22:13.890170 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:22:14 crc kubenswrapper[4676]: I1204 15:22:14.383637 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:14 crc kubenswrapper[4676]: E1204 15:22:14.383811 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:15 crc kubenswrapper[4676]: I1204 15:22:15.383801 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:15 crc kubenswrapper[4676]: I1204 15:22:15.383838 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:15 crc kubenswrapper[4676]: E1204 15:22:15.384005 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:15 crc kubenswrapper[4676]: I1204 15:22:15.384105 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:15 crc kubenswrapper[4676]: E1204 15:22:15.384411 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:15 crc kubenswrapper[4676]: I1204 15:22:15.384467 4676 scope.go:117] "RemoveContainer" containerID="ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5" Dec 04 15:22:15 crc kubenswrapper[4676]: E1204 15:22:15.384543 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:15 crc kubenswrapper[4676]: I1204 15:22:15.385825 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.370288 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/1.log" Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.370399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerStarted","Data":"8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3"} Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.380250 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/3.log" Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.383180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerStarted","Data":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.383256 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.384042 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:22:16 crc kubenswrapper[4676]: E1204 15:22:16.384296 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:16 crc kubenswrapper[4676]: I1204 15:22:16.441016 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podStartSLOduration=117.440968455 podStartE2EDuration="1m57.440968455s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:16.440024249 +0000 UTC m=+143.874694106" watchObservedRunningTime="2025-12-04 15:22:16.440968455 +0000 UTC m=+143.875638312" Dec 04 15:22:17 crc kubenswrapper[4676]: I1204 15:22:17.385236 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:17 crc kubenswrapper[4676]: I1204 15:22:17.385268 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:17 crc kubenswrapper[4676]: E1204 15:22:17.385443 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:17 crc kubenswrapper[4676]: I1204 15:22:17.385490 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:17 crc kubenswrapper[4676]: E1204 15:22:17.385567 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:17 crc kubenswrapper[4676]: E1204 15:22:17.385648 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:17 crc kubenswrapper[4676]: I1204 15:22:17.930046 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nsvkq"] Dec 04 15:22:17 crc kubenswrapper[4676]: I1204 15:22:17.930255 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:17 crc kubenswrapper[4676]: E1204 15:22:17.930400 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:18 crc kubenswrapper[4676]: E1204 15:22:18.892517 4676 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 04 15:22:19 crc kubenswrapper[4676]: I1204 15:22:19.383633 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:19 crc kubenswrapper[4676]: I1204 15:22:19.383681 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:19 crc kubenswrapper[4676]: I1204 15:22:19.383706 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:19 crc kubenswrapper[4676]: I1204 15:22:19.383652 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:19 crc kubenswrapper[4676]: E1204 15:22:19.383793 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:19 crc kubenswrapper[4676]: E1204 15:22:19.383857 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:19 crc kubenswrapper[4676]: E1204 15:22:19.383970 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:19 crc kubenswrapper[4676]: E1204 15:22:19.384093 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:21 crc kubenswrapper[4676]: I1204 15:22:21.383719 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:21 crc kubenswrapper[4676]: I1204 15:22:21.383719 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:21 crc kubenswrapper[4676]: E1204 15:22:21.384476 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:21 crc kubenswrapper[4676]: I1204 15:22:21.383777 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:21 crc kubenswrapper[4676]: E1204 15:22:21.384587 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:21 crc kubenswrapper[4676]: E1204 15:22:21.384306 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:21 crc kubenswrapper[4676]: I1204 15:22:21.383861 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:21 crc kubenswrapper[4676]: E1204 15:22:21.384717 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:23 crc kubenswrapper[4676]: I1204 15:22:23.384183 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:23 crc kubenswrapper[4676]: I1204 15:22:23.384206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:23 crc kubenswrapper[4676]: I1204 15:22:23.384206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:23 crc kubenswrapper[4676]: I1204 15:22:23.384230 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:23 crc kubenswrapper[4676]: E1204 15:22:23.386864 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nsvkq" podUID="711742b9-8c03-4234-ae1d-4d7d3baa4217" Dec 04 15:22:23 crc kubenswrapper[4676]: E1204 15:22:23.387247 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 04 15:22:23 crc kubenswrapper[4676]: E1204 15:22:23.398413 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 04 15:22:23 crc kubenswrapper[4676]: E1204 15:22:23.398536 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.267325 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268211 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:24:27.268169035 +0000 UTC m=+274.702838892 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.268397 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.268524 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.268602 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.268625 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268626 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268750 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268760 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268778 4676 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268791 4676 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268809 4676 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268691 4676 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268771 4676 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.268858 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-04 15:24:27.268838794 +0000 UTC m=+274.703508651 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.269056 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-04 15:24:27.269044119 +0000 UTC m=+274.703713966 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.269073 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:24:27.26906647 +0000 UTC m=+274.703736327 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: E1204 15:22:25.269095 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-04 15:24:27.269090241 +0000 UTC m=+274.703760098 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.384254 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.384309 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.384254 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.384422 4676 util.go:30] "No sandbox for pod can be found. 
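Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"

Two distinct failures above share the same retry pattern. The UnmountVolume.TearDown for pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 fails because the kubevirt.io.hostpath-provisioner driver is not (yet) in the kubelet's list of registered CSI drivers, and the projected kube-api-access-* and configmap/secret mounts fail because the referenced objects are not yet in the kubelet's local cache. In both cases nestedpendingoperations.go:348 backs the operation off: "No retries permitted until 2025-12-04 15:24:27 ... (durationBeforeRetry 2m2s)", so even though the caches populate seconds later (15:22:25.388 below), these mounts will not be retried until 15:24:27. The 2m2s reads as the ceiling of an exponential backoff; a sketch follows, in which the 500ms initial delay and doubling factor are assumptions and only the 2m2s cap is taken from the log.

// backoff.go - a sketch of a capped exponential backoff consistent with the
// "durationBeforeRetry 2m2s" ceiling in the log. Not the kubelet's code.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 2*time.Minute + 2*time.Second // the 2m2s cap seen in the log
	delay := 500 * time.Millisecond                // assumed initial delay
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // later attempts all wait the full 2m2s
		}
	}
}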
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.388226 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.388822 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.388932 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.389361 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.389831 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 15:22:25 crc kubenswrapper[4676]: I1204 15:22:25.389975 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 15:22:26 crc kubenswrapper[4676]: I1204 15:22:26.968408 4676 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.014556 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8k7hs"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.015431 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.015717 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.016046 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.016328 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.016613 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.018390 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jwhjf"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.018847 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.020628 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.021035 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.021188 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.021502 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.021777 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.022245 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.023039 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.023205 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.024597 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.027388 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.027732 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.027943 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.028204 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.028263 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.028942 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.029488 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032065 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032318 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032352 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032431 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032590 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.032873 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.033068 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.033726 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.034489 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.038529 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.038794 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.039205 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.040876 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.041576 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.050778 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.053614 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.054453 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.055529 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.055792 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.056584 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.056775 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.056820 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.056873 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.056973 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057058 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057136 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057241 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057332 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057413 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057433 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057682 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.057967 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 15:22:27 
crc kubenswrapper[4676]: I1204 15:22:27.059211 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059505 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjdxf\" (UniqueName: \"kubernetes.io/projected/fdf10486-0860-4dad-984e-d82daaac8ecd-kube-api-access-cjdxf\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059550 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-dir\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-auth-proxy-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059619 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-policies\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059641 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059663 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8q7b\" (UniqueName: \"kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059690 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059719 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-config\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: 
\"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdf10486-0860-4dad-984e-d82daaac8ecd-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059815 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059841 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-encryption-config\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-service-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059928 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmd94\" (UniqueName: \"kubernetes.io/projected/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-kube-api-access-dmd94\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059964 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7lm\" (UniqueName: \"kubernetes.io/projected/7bdebf26-30a2-44be-88b4-24d230d01708-kube-api-access-ss7lm\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.059991 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060032 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f22hw\" (UniqueName: \"kubernetes.io/projected/d3e5dc91-43ef-4a63-9898-504dfd9b4398-kube-api-access-f22hw\") pod 
\"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060059 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060085 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-client\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060107 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r4w\" (UniqueName: \"kubernetes.io/projected/76f9c064-9769-41c0-8936-340f895bc36d-kube-api-access-x7r4w\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060147 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060169 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bdebf26-30a2-44be-88b4-24d230d01708-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-serving-cert\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060218 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3162c38f-2d77-4c34-a890-a8f321e1eebc-machine-approver-tls\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060264 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8d99\" (UniqueName: \"kubernetes.io/projected/3162c38f-2d77-4c34-a890-a8f321e1eebc-kube-api-access-x8d99\") pod \"machine-approver-56656f9798-fz52v\" (UID: 
\"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060291 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060314 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f9c064-9769-41c0-8936-340f895bc36d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060356 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-images\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060380 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fdf10486-0860-4dad-984e-d82daaac8ecd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060406 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08b22ef-20e1-4a1c-bec4-e35311bf926b-serving-cert\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hndtg\" (UniqueName: \"kubernetes.io/projected/a08b22ef-20e1-4a1c-bec4-e35311bf926b-kube-api-access-hndtg\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060467 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060545 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert\") pod 
\"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060575 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-config\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.060595 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.061293 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.062543 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.062864 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qlskj"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.063376 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.063550 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-qbw9s"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.063748 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.064352 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.064454 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.064625 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.077374 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.079559 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.079892 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.080104 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.083239 4676 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.083344 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.091264 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.100531 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.104728 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.105656 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.106046 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.106178 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.106661 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.106943 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.107172 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.107371 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x25bq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.108239 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.108582 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.109693 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.112860 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.113391 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.113690 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.114031 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.114506 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7tn2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.114959 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.115325 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.115743 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.118500 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.119187 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.119297 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.119544 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cts56"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.119749 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.119811 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.120672 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.120715 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.121106 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.121303 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.121451 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.121878 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122033 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122230 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122278 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122567 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122708 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122988 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.123181 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.123347 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.123633 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.124141 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.124300 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.124641 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.124818 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125136 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125418 4676 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125501 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125549 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125587 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125698 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125859 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.125963 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.126470 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.122582 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.126788 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.126728 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.129002 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.129217 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.129681 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.130404 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.130886 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.131196 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.138962 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.139421 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.140243 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.140499 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.140567 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.140806 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.140971 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.141966 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.142076 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.143036 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.143178 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.143376 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.143520 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.147383 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.149162 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.159313 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177093 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-config\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177170 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177237 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57fzv\" (UniqueName: \"kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177352 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/685f9e11-cab9-4f06-bcfe-9931c77f4d23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177392 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177443 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjdxf\" (UniqueName: \"kubernetes.io/projected/fdf10486-0860-4dad-984e-d82daaac8ecd-kube-api-access-cjdxf\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177521 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-dir\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177566 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177620 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f9e11-cab9-4f06-bcfe-9931c77f4d23-config\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-policies\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177683 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177722 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8q7b\" (UniqueName: \"kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177773 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-auth-proxy-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177823 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-config\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177867 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.177960 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fdf10486-0860-4dad-984e-d82daaac8ecd-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178035 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l62zs\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-kube-api-access-l62zs\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178181 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-encryption-config\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178237 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-service-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178289 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmd94\" (UniqueName: \"kubernetes.io/projected/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-kube-api-access-dmd94\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178345 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178410 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7lm\" (UniqueName: \"kubernetes.io/projected/7bdebf26-30a2-44be-88b4-24d230d01708-kube-api-access-ss7lm\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178478 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178638 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f22hw\" (UniqueName: \"kubernetes.io/projected/d3e5dc91-43ef-4a63-9898-504dfd9b4398-kube-api-access-f22hw\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178682 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-client\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r4w\" (UniqueName: \"kubernetes.io/projected/76f9c064-9769-41c0-8936-340f895bc36d-kube-api-access-x7r4w\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178753 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178787 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178816 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9l24\" (UniqueName: \"kubernetes.io/projected/662295c5-dfd2-4536-bd74-8d5624100ea5-kube-api-access-q9l24\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178853 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-serving-cert\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178879 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3162c38f-2d77-4c34-a890-a8f321e1eebc-machine-approver-tls\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bdebf26-30a2-44be-88b4-24d230d01708-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.178992 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179022 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179043 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8d99\" (UniqueName: \"kubernetes.io/projected/3162c38f-2d77-4c34-a890-a8f321e1eebc-kube-api-access-x8d99\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179077 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-images\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f9c064-9769-41c0-8936-340f895bc36d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179149 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-images\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179187 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fdf10486-0860-4dad-984e-d82daaac8ecd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179221 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08b22ef-20e1-4a1c-bec4-e35311bf926b-serving-cert\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/685f9e11-cab9-4f06-bcfe-9931c77f4d23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179285 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hndtg\" (UniqueName: \"kubernetes.io/projected/a08b22ef-20e1-4a1c-bec4-e35311bf926b-kube-api-access-hndtg\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179325 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179433 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.179486 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/662295c5-dfd2-4536-bd74-8d5624100ea5-proxy-tls\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.182136 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.182771 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 15:22:27 
crc kubenswrapper[4676]: I1204 15:22:27.183173 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.183321 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-dir\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.184839 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-auth-proxy-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.185350 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.188220 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-audit-policies\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.189529 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.189691 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.189381 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-config\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.201188 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-config\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.210286 4676 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.211823 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.215031 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-config\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.216812 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.216981 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.217209 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.218949 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/76f9c064-9769-41c0-8936-340f895bc36d-images\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.221582 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3162c38f-2d77-4c34-a890-a8f321e1eebc-config\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.222181 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.222548 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.222567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a08b22ef-20e1-4a1c-bec4-e35311bf926b-service-ca-bundle\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.222690 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.223035 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.223188 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nvsfq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.223748 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/fdf10486-0860-4dad-984e-d82daaac8ecd-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.223771 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.224343 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.229187 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-encryption-config\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.229949 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.230108 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.230244 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.230475 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4h6zp"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.231006 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.231106 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.231009 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-nrpqk"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.231786 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.232438 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.232721 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.233893 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-etcd-client\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.234393 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rbngc"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.235771 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.236574 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.237205 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.237573 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.238057 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.238659 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/3162c38f-2d77-4c34-a890-a8f321e1eebc-machine-approver-tls\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.238808 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08b22ef-20e1-4a1c-bec4-e35311bf926b-serving-cert\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.239492 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3e5dc91-43ef-4a63-9898-504dfd9b4398-serving-cert\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.239686 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.240431 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.240742 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.241199 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.241969 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jwhjf"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.243505 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8k7hs"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.245765 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/76f9c064-9769-41c0-8936-340f895bc36d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.246189 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.246430 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.247657 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.250764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fdf10486-0860-4dad-984e-d82daaac8ecd-serving-cert\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.251124 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.252292 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.254000 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.254840 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bdebf26-30a2-44be-88b4-24d230d01708-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.254969 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.256156 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qbw9s"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.258218 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.258701 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.260406 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.265850 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.267719 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.268081 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.269830 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nvsfq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.271227 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.272165 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4hnbc"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.273541 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.274137 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qlskj"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.280254 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cts56"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.282656 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.281577 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/685f9e11-cab9-4f06-bcfe-9931c77f4d23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283363 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/662295c5-dfd2-4536-bd74-8d5624100ea5-proxy-tls\") pod 
\"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283540 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283679 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57fzv\" (UniqueName: \"kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283821 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/685f9e11-cab9-4f06-bcfe-9931c77f4d23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.283986 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284108 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284205 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284319 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f9e11-cab9-4f06-bcfe-9931c77f4d23-config\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284477 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l62zs\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-kube-api-access-l62zs\") pod 
\"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.280776 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.284953 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9l24\" (UniqueName: \"kubernetes.io/projected/662295c5-dfd2-4536-bd74-8d5624100ea5-kube-api-access-q9l24\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.285096 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.285212 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-images\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.285778 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.285827 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/685f9e11-cab9-4f06-bcfe-9931c77f4d23-config\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.286336 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.285097 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.287171 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.287462 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/685f9e11-cab9-4f06-bcfe-9931c77f4d23-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.290842 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.291624 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.291704 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.293939 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4h6zp"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.295522 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x25bq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.297600 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5j6kp"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.300018 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.302799 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wk9bw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.303097 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.306547 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.306865 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.315715 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.317968 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.319884 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.328150 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.330954 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.333373 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7tn2"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.334960 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.336057 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.336447 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.337747 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.339384 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.340598 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5j6kp"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.342899 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.349276 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/662295c5-dfd2-4536-bd74-8d5624100ea5-images\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.356428 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.356945 4676 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rbngc"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.358440 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wk9bw"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.359633 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.360193 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.365664 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.370588 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4hnbc"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.371827 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-b6qzl"] Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.372600 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.382372 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.398655 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.419656 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.438779 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.461206 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.479708 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.498377 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.519179 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.538959 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.559765 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.579092 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 15:22:27 crc 
kubenswrapper[4676]: I1204 15:22:27.600211 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.619765 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.638712 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.658981 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.678941 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.699503 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.719211 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.739513 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.765382 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.779086 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.819329 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.840010 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.858578 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.878999 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.889157 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/662295c5-dfd2-4536-bd74-8d5624100ea5-proxy-tls\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.899737 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.918897 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 15:22:27 crc 
kubenswrapper[4676]: I1204 15:22:27.939699 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 15:22:27 crc kubenswrapper[4676]: I1204 15:22:27.959437 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.016857 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7lm\" (UniqueName: \"kubernetes.io/projected/7bdebf26-30a2-44be-88b4-24d230d01708-kube-api-access-ss7lm\") pod \"cluster-samples-operator-665b6dd947-kh68m\" (UID: \"7bdebf26-30a2-44be-88b4-24d230d01708\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.034606 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8q7b\" (UniqueName: \"kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b\") pod \"controller-manager-879f6c89f-dlhc6\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.057550 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjdxf\" (UniqueName: \"kubernetes.io/projected/fdf10486-0860-4dad-984e-d82daaac8ecd-kube-api-access-cjdxf\") pod \"openshift-config-operator-7777fb866f-nr6vs\" (UID: \"fdf10486-0860-4dad-984e-d82daaac8ecd\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.074162 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r4w\" (UniqueName: \"kubernetes.io/projected/76f9c064-9769-41c0-8936-340f895bc36d-kube-api-access-x7r4w\") pod \"machine-api-operator-5694c8668f-8k7hs\" (UID: \"76f9c064-9769-41c0-8936-340f895bc36d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.095412 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hndtg\" (UniqueName: \"kubernetes.io/projected/a08b22ef-20e1-4a1c-bec4-e35311bf926b-kube-api-access-hndtg\") pod \"authentication-operator-69f744f599-jwhjf\" (UID: \"a08b22ef-20e1-4a1c-bec4-e35311bf926b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.114307 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f22hw\" (UniqueName: \"kubernetes.io/projected/d3e5dc91-43ef-4a63-9898-504dfd9b4398-kube-api-access-f22hw\") pod \"apiserver-7bbb656c7d-2rvct\" (UID: \"d3e5dc91-43ef-4a63-9898-504dfd9b4398\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.133896 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmd94\" (UniqueName: \"kubernetes.io/projected/b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290-kube-api-access-dmd94\") pod \"openshift-apiserver-operator-796bbdcf4f-8t6kz\" (UID: \"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.154580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x8d99\" (UniqueName: \"kubernetes.io/projected/3162c38f-2d77-4c34-a890-a8f321e1eebc-kube-api-access-x8d99\") pod \"machine-approver-56656f9798-fz52v\" (UID: \"3162c38f-2d77-4c34-a890-a8f321e1eebc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.158879 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.172830 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.179969 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.198840 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.214488 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.220043 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.236641 4676 request.go:700] Waited for 1.011699299s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.239205 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.249209 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.259844 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.262280 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.279252 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.299396 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.313072 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.319617 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.330716 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.339549 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 15:22:28 crc kubenswrapper[4676]: W1204 15:22:28.347473 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3162c38f_2d77_4c34_a890_a8f321e1eebc.slice/crio-867dee2471a34f61832213635bbc626008664e0098b33ecd5a28f4a428fc08da WatchSource:0}: Error finding container 867dee2471a34f61832213635bbc626008664e0098b33ecd5a28f4a428fc08da: Status 404 returned error can't find the container with id 867dee2471a34f61832213635bbc626008664e0098b33ecd5a28f4a428fc08da Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.359523 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.384512 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.388499 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.402719 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.421320 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.423207 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.442690 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.460531 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.467080 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.470564 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" event={"ID":"3162c38f-2d77-4c34-a890-a8f321e1eebc","Type":"ContainerStarted","Data":"867dee2471a34f61832213635bbc626008664e0098b33ecd5a28f4a428fc08da"} Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.481066 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.499809 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.519479 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.539541 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.562599 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.581160 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.582856 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m"] Dec 04 15:22:28 crc kubenswrapper[4676]: W1204 15:22:28.585564 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf10486_0860_4dad_984e_d82daaac8ecd.slice/crio-0c5f44d31ef85de68d6d0072e820669e4f9796c8bb55420746603494f35fec96 WatchSource:0}: Error finding container 0c5f44d31ef85de68d6d0072e820669e4f9796c8bb55420746603494f35fec96: Status 404 returned error can't find the container with id 0c5f44d31ef85de68d6d0072e820669e4f9796c8bb55420746603494f35fec96 Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.598665 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.604578 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.619211 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.639473 4676 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.645579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-8k7hs"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.658575 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.680754 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.695583 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.700058 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.721559 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.737659 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jwhjf"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.738883 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.758601 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.763813 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz"] Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.779089 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.806095 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.819032 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.839809 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.859412 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.879402 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.899232 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.919628 4676 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.938965 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.958986 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.978613 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 15:22:28 crc kubenswrapper[4676]: I1204 15:22:28.999255 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.019192 4676 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.040040 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.096048 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.114206 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l62zs\" (UniqueName: \"kubernetes.io/projected/e9bbf7af-9cc9-4dec-a933-dff6683aa16a-kube-api-access-l62zs\") pod \"cluster-image-registry-operator-dc59b4c8b-rqcz2\" (UID: \"e9bbf7af-9cc9-4dec-a933-dff6683aa16a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.134111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9l24\" (UniqueName: \"kubernetes.io/projected/662295c5-dfd2-4536-bd74-8d5624100ea5-kube-api-access-q9l24\") pod \"machine-config-operator-74547568cd-cts56\" (UID: \"662295c5-dfd2-4536-bd74-8d5624100ea5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.150832 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.158462 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.160499 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/685f9e11-cab9-4f06-bcfe-9931c77f4d23-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-f7kvn\" (UID: \"685f9e11-cab9-4f06-bcfe-9931c77f4d23\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.179338 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.198967 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.219771 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.234137 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.236625 4676 request.go:700] Waited for 1.92793848s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0 Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.238748 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.260229 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.263960 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.279616 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.299587 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.319341 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.339887 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.476995 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" event={"ID":"fdf10486-0860-4dad-984e-d82daaac8ecd","Type":"ContainerStarted","Data":"0c5f44d31ef85de68d6d0072e820669e4f9796c8bb55420746603494f35fec96"} Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.478033 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" event={"ID":"591b399c-21b2-4c6f-ab3a-d424df670c0b","Type":"ContainerStarted","Data":"52bf81443f7bd00b4e502eb20eb76338c3efba6f8e1ec377fdb8a221641e77bd"} Dec 04 15:22:29 crc kubenswrapper[4676]: I1204 15:22:29.478850 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" event={"ID":"76f9c064-9769-41c0-8936-340f895bc36d","Type":"ContainerStarted","Data":"438f4826eba499256456eace0f5d4c9d1bdec3fc79bb037eaed3101733954d31"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.391601 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57fzv\" (UniqueName: \"kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv\") pod \"route-controller-manager-6576b87f9c-w9pnw\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.393276 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.393353 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.393391 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmdm\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm\") pod 
\"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.393680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.393735 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: E1204 15:22:30.394891 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:30.894870183 +0000 UTC m=+158.329540040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.395073 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.399993 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.400231 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.400659 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: W1204 15:22:30.413838 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92bcd16_c0e2_4cb6_8a6b_63aa9d09e290.slice/crio-1eeccdb8053472d10f99a48822d73e19d68050a745728d71852061774a4d73f4 WatchSource:0}: Error finding container 1eeccdb8053472d10f99a48822d73e19d68050a745728d71852061774a4d73f4: Status 404 returned error can't find the container with id 1eeccdb8053472d10f99a48822d73e19d68050a745728d71852061774a4d73f4 Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.504773 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" event={"ID":"3162c38f-2d77-4c34-a890-a8f321e1eebc","Type":"ContainerStarted","Data":"75f1c7d2820337897c50ef368c48a6df4e5f55484ca053e9f9c11b85f577326e"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.505506 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.505765 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92pff\" (UniqueName: \"kubernetes.io/projected/f08aef24-f00f-43da-8ac1-79def39914ce-kube-api-access-92pff\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.505884 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/29205e6d-74be-4a99-b92d-50152cb21845-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: 
\"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506010 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506036 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-node-pullsecrets\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506069 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-profile-collector-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506105 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87t45\" (UniqueName: \"kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506121 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506172 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506205 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cxxn\" (UniqueName: \"kubernetes.io/projected/64c8acea-9343-42d1-84cc-168d575e30a5-kube-api-access-5cxxn\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506242 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf598\" (UniqueName: \"kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598\") pod \"oauth-openshift-558db77b4-675c2\" (UID: 
\"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506314 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506362 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99w2k\" (UniqueName: \"kubernetes.io/projected/559634f6-983d-4ae2-959e-8b54abc1326d-kube-api-access-99w2k\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: E1204 15:22:30.506422 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.006380998 +0000 UTC m=+158.441050905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506492 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506553 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506589 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-service-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506616 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bbe37e1-bbb1-4298-8427-f8c233470593-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506644 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506663 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-serving-cert\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506687 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506714 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506749 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506796 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506842 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71b79282-23b9-4bfd-b5b9-446f82131905-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506874 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506898 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7cv\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-kube-api-access-bn7cv\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506946 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-config\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506972 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.506998 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bbe37e1-bbb1-4298-8427-f8c233470593-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507021 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb280ecc-1666-4a9a-a2b3-910b09de7474-metrics-tls\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507069 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-trusted-ca\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507131 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507169 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507191 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-serving-cert\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507219 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-apiservice-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507264 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmdm\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507288 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507309 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit-dir\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507360 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-client\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507411 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507432 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-image-import-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507458 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
\"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507544 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qlhf\" (UniqueName: \"kubernetes.io/projected/1348ed48-644b-49f3-b674-92cd4e39d1ec-kube-api-access-7qlhf\") pod \"downloads-7954f5f757-qbw9s\" (UID: \"1348ed48-644b-49f3-b674-92cd4e39d1ec\") " pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507661 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/559634f6-983d-4ae2-959e-8b54abc1326d-tmpfs\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.507711 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508607 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4cbh\" (UniqueName: \"kubernetes.io/projected/ae863415-6074-4ce2-9e25-8c0705ed1e80-kube-api-access-j4cbh\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-srv-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508676 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508737 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508773 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x25bq\" (UID: 
\"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508809 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.508850 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-srv-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509216 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509362 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-encryption-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509518 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509604 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-client\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509637 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509801 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsp68\" (UniqueName: \"kubernetes.io/projected/a75359a0-583e-4732-a043-4088c2ca0910-kube-api-access-qsp68\") pod \"migrator-59844c95c7-nltr4\" (UID: \"a75359a0-583e-4732-a043-4088c2ca0910\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509930 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzdj\" (UniqueName: \"kubernetes.io/projected/86926fca-c917-498b-a3f3-7315ec1e5370-kube-api-access-5fzdj\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.509974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-serving-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.510015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-webhook-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.510050 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-config\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.510736 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb280ecc-1666-4a9a-a2b3-910b09de7474-trusted-ca\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.510983 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511082 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/64c8acea-9343-42d1-84cc-168d575e30a5-serving-cert\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511149 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b79282-23b9-4bfd-b5b9-446f82131905-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511182 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkcj\" (UniqueName: \"kubernetes.io/projected/6bbe37e1-bbb1-4298-8427-f8c233470593-kube-api-access-tnkcj\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511263 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511430 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhzk\" (UniqueName: \"kubernetes.io/projected/29205e6d-74be-4a99-b92d-50152cb21845-kube-api-access-6fhzk\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: \"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.511458 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b79282-23b9-4bfd-b5b9-446f82131905-config\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.512743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.513026 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" event={"ID":"7bdebf26-30a2-44be-88b4-24d230d01708","Type":"ContainerStarted","Data":"8540d5d3ab369613f286105a2945eb7e3e3f35b31b923c41776a0457e075af9f"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.516540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls\") pod 
\"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.518395 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" event={"ID":"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290","Type":"ContainerStarted","Data":"1eeccdb8053472d10f99a48822d73e19d68050a745728d71852061774a4d73f4"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.521274 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.525044 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" event={"ID":"d3e5dc91-43ef-4a63-9898-504dfd9b4398","Type":"ContainerStarted","Data":"b2d16fbcf4095b8036f9c8a3b307f65d6f705bb4ac9ecdb20f89107f22088a8e"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.527678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" event={"ID":"a08b22ef-20e1-4a1c-bec4-e35311bf926b","Type":"ContainerStarted","Data":"5a2fda57e2b95752314cc4adc7dbd845c5a9eb4c9c7dca82d268e5517985114e"} Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.548677 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmdm\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.561190 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.646777 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/29205e6d-74be-4a99-b92d-50152cb21845-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: \"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647222 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-stats-auth\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-mountpoint-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647292 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647319 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647347 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-node-pullsecrets\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647375 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-profile-collector-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647416 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87t45\" (UniqueName: \"kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647430 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647454 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdjw9\" (UniqueName: \"kubernetes.io/projected/e89c9638-4420-465f-b9f4-0afe798f1610-kube-api-access-zdjw9\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z7qq\" (UniqueName: \"kubernetes.io/projected/ac97c016-fcdc-4499-b4d4-6e5478c1de36-kube-api-access-8z7qq\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: 
\"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647516 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-cabundle\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647548 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647569 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j69qn\" (UniqueName: \"kubernetes.io/projected/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-kube-api-access-j69qn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647598 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cxxn\" (UniqueName: \"kubernetes.io/projected/64c8acea-9343-42d1-84cc-168d575e30a5-kube-api-access-5cxxn\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-socket-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6db772-e434-4619-b2e3-bacb9b4c527a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647667 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx2rl\" (UniqueName: \"kubernetes.io/projected/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-kube-api-access-zx2rl\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647688 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf598\" (UniqueName: \"kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647724 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-service-ca-bundle\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647745 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99w2k\" (UniqueName: \"kubernetes.io/projected/559634f6-983d-4ae2-959e-8b54abc1326d-kube-api-access-99w2k\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647811 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2lw\" (UniqueName: \"kubernetes.io/projected/65156769-02c6-4cb1-a9ff-c51c8b458135-kube-api-access-6s2lw\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-default-certificate\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-service-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647960 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bbe37e1-bbb1-4298-8427-f8c233470593-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647981 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.647999 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-serving-cert\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648026 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-certs\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648063 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648085 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z9dx\" (UniqueName: \"kubernetes.io/projected/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-kube-api-access-7z9dx\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648123 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648139 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfa82d87-b071-46fc-af14-295ff38871aa-proxy-tls\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648157 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-metrics-certs\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648216 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jtnb\" (UniqueName: \"kubernetes.io/projected/79d432ec-ac07-4516-a0a0-38fc02ec3e80-kube-api-access-9jtnb\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648235 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648252 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b6db772-e434-4619-b2e3-bacb9b4c527a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648271 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-csi-data-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648307 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71b79282-23b9-4bfd-b5b9-446f82131905-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648336 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7cv\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-kube-api-access-bn7cv\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648359 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648404 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-config\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb280ecc-1666-4a9a-a2b3-910b09de7474-metrics-tls\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648486 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648508 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-trusted-ca\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648535 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648558 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bbe37e1-bbb1-4298-8427-f8c233470593-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648576 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648595 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfa82d87-b071-46fc-af14-295ff38871aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648652 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7817860b-74ba-4dec-b243-6f3571884745-cert\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648675 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-registration-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.648704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: E1204 15:22:30.650387 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.150364993 +0000 UTC m=+158.585034920 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.656412 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-node-pullsecrets\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.664033 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-key\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.664281 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-serving-cert\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.664341 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-apiservice-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.664504 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.671476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-serving-cert\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.678971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit-dir\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679021 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-serving-cert\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679075 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-client\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679097 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-plugins-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679116 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5d16762-1e73-4856-9593-ae335bce123b-metrics-tls\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679240 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679260 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-image-import-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679304 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4k2\" (UniqueName: \"kubernetes.io/projected/ed5477e6-0f8c-457f-a314-6a8263aa89ac-kube-api-access-gr4k2\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.679324 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qlhf\" (UniqueName: \"kubernetes.io/projected/1348ed48-644b-49f3-b674-92cd4e39d1ec-kube-api-access-7qlhf\") pod \"downloads-7954f5f757-qbw9s\" (UID: \"1348ed48-644b-49f3-b674-92cd4e39d1ec\") " pod="openshift-console/downloads-7954f5f757-qbw9s"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.681266 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-profile-collector-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.682345 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit-dir\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683104 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-profile-collector-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683404 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/79d432ec-ac07-4516-a0a0-38fc02ec3e80-metrics-tls\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683461 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsjqk\" (UniqueName: \"kubernetes.io/projected/2352b624-13d5-49ce-ac83-0a72f19879af-kube-api-access-tsjqk\") pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683489 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cl8\" (UniqueName: \"kubernetes.io/projected/cfa82d87-b071-46fc-af14-295ff38871aa-kube-api-access-g6cl8\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683523 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/559634f6-983d-4ae2-959e-8b54abc1326d-tmpfs\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.683544 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrp8k\" (UniqueName: \"kubernetes.io/projected/d5d16762-1e73-4856-9593-ae335bce123b-kube-api-access-hrp8k\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684249 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684554 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684590 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4cbh\" (UniqueName: \"kubernetes.io/projected/ae863415-6074-4ce2-9e25-8c0705ed1e80-kube-api-access-j4cbh\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684661 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg22\" (UniqueName: \"kubernetes.io/projected/7817860b-74ba-4dec-b243-6f3571884745-kube-api-access-dhg22\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684685 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-srv-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684723 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684859 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmlns\" (UniqueName: \"kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684888 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-srv-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"
Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684929 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684949 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684984 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.685005 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjpvz\" (UniqueName: \"kubernetes.io/projected/d7d3cfa5-43a3-4257-9461-2fd207b53800-kube-api-access-kjpvz\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.685038 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-encryption-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.684855 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bbe37e1-bbb1-4298-8427-f8c233470593-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.741305 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.751545 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-client\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.751670 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9b6db772-e434-4619-b2e3-bacb9b4c527a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.751948 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-node-bootstrap-token\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.752388 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsp68\" (UniqueName: \"kubernetes.io/projected/a75359a0-583e-4732-a043-4088c2ca0910-kube-api-access-qsp68\") pod \"migrator-59844c95c7-nltr4\" (UID: \"a75359a0-583e-4732-a043-4088c2ca0910\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.752459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzdj\" (UniqueName: \"kubernetes.io/projected/86926fca-c917-498b-a3f3-7315ec1e5370-kube-api-access-5fzdj\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.752614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.752865 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-config\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.752949 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-serving-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753206 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-webhook-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753240 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79d432ec-ac07-4516-a0a0-38fc02ec3e80-config-volume\") pod \"dns-default-wk9bw\" (UID: 
\"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753316 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv6ll\" (UniqueName: \"kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb280ecc-1666-4a9a-a2b3-910b09de7474-trusted-ca\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753474 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64c8acea-9343-42d1-84cc-168d575e30a5-serving-cert\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753553 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.753627 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b79282-23b9-4bfd-b5b9-446f82131905-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.755211 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.758271 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.759225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/29205e6d-74be-4a99-b92d-50152cb21845-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: \"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.759433 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.759962 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/559634f6-983d-4ae2-959e-8b54abc1326d-tmpfs\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.763063 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-service-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.763395 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-config\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.764407 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bbe37e1-bbb1-4298-8427-f8c233470593-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.764807 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.765382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-ca\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.767127 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/64c8acea-9343-42d1-84cc-168d575e30a5-trusted-ca\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.767279 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-apiservice-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc 
kubenswrapper[4676]: I1204 15:22:30.767540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.767567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.768810 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.770104 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb280ecc-1666-4a9a-a2b3-910b09de7474-trusted-ca\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.770890 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.773232 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.773533 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.773933 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.774078 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.774919 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb280ecc-1666-4a9a-a2b3-910b09de7474-metrics-tls\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.775565 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f08aef24-f00f-43da-8ac1-79def39914ce-config\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.776504 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-serving-cert\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.776613 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.776697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/559634f6-983d-4ae2-959e-8b54abc1326d-webhook-cert\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.777367 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99w2k\" (UniqueName: \"kubernetes.io/projected/559634f6-983d-4ae2-959e-8b54abc1326d-kube-api-access-99w2k\") pod \"packageserver-d55dfcdfc-k55v7\" (UID: \"559634f6-983d-4ae2-959e-8b54abc1326d\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.778718 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f08aef24-f00f-43da-8ac1-79def39914ce-etcd-client\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.778739 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: 
I1204 15:22:30.779625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64c8acea-9343-42d1-84cc-168d575e30a5-serving-cert\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.782608 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71b79282-23b9-4bfd-b5b9-446f82131905-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783265 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkcj\" (UniqueName: \"kubernetes.io/projected/6bbe37e1-bbb1-4298-8427-f8c233470593-kube-api-access-tnkcj\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-config\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783432 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fhzk\" (UniqueName: \"kubernetes.io/projected/29205e6d-74be-4a99-b92d-50152cb21845-kube-api-access-6fhzk\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: \"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783520 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b79282-23b9-4bfd-b5b9-446f82131905-config\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783554 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e89c9638-4420-465f-b9f4-0afe798f1610-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783590 
4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92pff\" (UniqueName: \"kubernetes.io/projected/f08aef24-f00f-43da-8ac1-79def39914ce-kube-api-access-92pff\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-client\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.783621 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2352b624-13d5-49ce-ac83-0a72f19879af-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.784289 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.784440 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.784743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.785081 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87t45\" (UniqueName: \"kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.785162 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ed5477e6-0f8c-457f-a314-6a8263aa89ac-srv-cert\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.785560 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-audit\") pod \"apiserver-76f77b778f-x25bq\" (UID: 
\"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.785889 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.786028 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71b79282-23b9-4bfd-b5b9-446f82131905-config\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.786269 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca\") pod \"console-f9d7485db-mtj84\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.786402 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-image-import-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.786667 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7cv\" (UniqueName: \"kubernetes.io/projected/eb280ecc-1666-4a9a-a2b3-910b09de7474-kube-api-access-bn7cv\") pod \"ingress-operator-5b745b69d9-jc9zk\" (UID: \"eb280ecc-1666-4a9a-a2b3-910b09de7474\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.787087 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.787538 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ae863415-6074-4ce2-9e25-8c0705ed1e80-etcd-serving-ca\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.789086 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/86926fca-c917-498b-a3f3-7315ec1e5370-srv-cert\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.796725 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.800035 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ae863415-6074-4ce2-9e25-8c0705ed1e80-encryption-config\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.800680 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4k2\" (UniqueName: \"kubernetes.io/projected/ed5477e6-0f8c-457f-a314-6a8263aa89ac-kube-api-access-gr4k2\") pod \"catalog-operator-68c6474976-njwq9\" (UID: \"ed5477e6-0f8c-457f-a314-6a8263aa89ac\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.800828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qlhf\" (UniqueName: \"kubernetes.io/projected/1348ed48-644b-49f3-b674-92cd4e39d1ec-kube-api-access-7qlhf\") pod \"downloads-7954f5f757-qbw9s\" (UID: \"1348ed48-644b-49f3-b674-92cd4e39d1ec\") " pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.816794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71b79282-23b9-4bfd-b5b9-446f82131905-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p2mg9\" (UID: \"71b79282-23b9-4bfd-b5b9-446f82131905\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.822826 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.829481 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.844584 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cxxn\" (UniqueName: \"kubernetes.io/projected/64c8acea-9343-42d1-84cc-168d575e30a5-kube-api-access-5cxxn\") pod \"console-operator-58897d9998-qlskj\" (UID: \"64c8acea-9343-42d1-84cc-168d575e30a5\") " pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:30 crc kubenswrapper[4676]: I1204 15:22:30.860803 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf598\" (UniqueName: \"kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598\") pod \"oauth-openshift-558db77b4-675c2\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.051308 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.073444 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.073878 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv6ll\" (UniqueName: \"kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.073977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-config\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074016 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e89c9638-4420-465f-b9f4-0afe798f1610-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074153 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2352b624-13d5-49ce-ac83-0a72f19879af-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-stats-auth\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-mountpoint-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074242 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdjw9\" (UniqueName: \"kubernetes.io/projected/e89c9638-4420-465f-b9f4-0afe798f1610-kube-api-access-zdjw9\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074263 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z7qq\" (UniqueName: \"kubernetes.io/projected/ac97c016-fcdc-4499-b4d4-6e5478c1de36-kube-api-access-8z7qq\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074284 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j69qn\" (UniqueName: \"kubernetes.io/projected/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-kube-api-access-j69qn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074306 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-cabundle\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074327 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-socket-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074364 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx2rl\" (UniqueName: \"kubernetes.io/projected/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-kube-api-access-zx2rl\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6db772-e434-4619-b2e3-bacb9b4c527a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074417 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-service-ca-bundle\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074447 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-default-certificate\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2lw\" (UniqueName: 
\"kubernetes.io/projected/65156769-02c6-4cb1-a9ff-c51c8b458135-kube-api-access-6s2lw\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074518 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-certs\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074542 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z9dx\" (UniqueName: \"kubernetes.io/projected/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-kube-api-access-7z9dx\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-metrics-certs\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074591 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074613 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfa82d87-b071-46fc-af14-295ff38871aa-proxy-tls\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074631 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jtnb\" (UniqueName: \"kubernetes.io/projected/79d432ec-ac07-4516-a0a0-38fc02ec3e80-kube-api-access-9jtnb\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074653 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074686 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b6db772-e434-4619-b2e3-bacb9b4c527a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074681 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-csi-data-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.074707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-csi-data-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075468 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075515 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075559 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfa82d87-b071-46fc-af14-295ff38871aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7817860b-74ba-4dec-b243-6f3571884745-cert\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-registration-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075663 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-key\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc 
kubenswrapper[4676]: I1204 15:22:31.075690 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-serving-cert\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075721 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-plugins-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075752 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5d16762-1e73-4856-9593-ae335bce123b-metrics-tls\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075802 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/79d432ec-ac07-4516-a0a0-38fc02ec3e80-metrics-tls\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075824 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsjqk\" (UniqueName: \"kubernetes.io/projected/2352b624-13d5-49ce-ac83-0a72f19879af-kube-api-access-tsjqk\") pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075848 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cl8\" (UniqueName: \"kubernetes.io/projected/cfa82d87-b071-46fc-af14-295ff38871aa-kube-api-access-g6cl8\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075887 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrp8k\" (UniqueName: \"kubernetes.io/projected/d5d16762-1e73-4856-9593-ae335bce123b-kube-api-access-hrp8k\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075937 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg22\" (UniqueName: \"kubernetes.io/projected/7817860b-74ba-4dec-b243-6f3571884745-kube-api-access-dhg22\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmlns\" (UniqueName: \"kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns\") pod 
\"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.075987 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076008 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjpvz\" (UniqueName: \"kubernetes.io/projected/d7d3cfa5-43a3-4257-9461-2fd207b53800-kube-api-access-kjpvz\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076045 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6db772-e434-4619-b2e3-bacb9b4c527a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076074 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-node-bootstrap-token\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076125 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79d432ec-ac07-4516-a0a0-38fc02ec3e80-config-volume\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076843 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/79d432ec-ac07-4516-a0a0-38fc02ec3e80-config-volume\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.076939 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.576876598 +0000 UTC m=+159.011546455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.076983 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.079014 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.079295 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-registration-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.079782 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cfa82d87-b071-46fc-af14-295ff38871aa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.080836 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.081615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b6db772-e434-4619-b2e3-bacb9b4c527a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.082013 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-config\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.085493 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-plugins-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 
15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.087610 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-service-ca-bundle\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.106238 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.109771 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.110479 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-mountpoint-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.115957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65156769-02c6-4cb1-a9ff-c51c8b458135-socket-dir\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.121232 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.124382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-cabundle\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.128894 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d5d16762-1e73-4856-9593-ae335bce123b-metrics-tls\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.129434 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-serving-cert\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.129857 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-metrics-certs\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " 
pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.130167 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac97c016-fcdc-4499-b4d4-6e5478c1de36-signing-key\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.130528 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-default-certificate\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.131308 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92pff\" (UniqueName: \"kubernetes.io/projected/f08aef24-f00f-43da-8ac1-79def39914ce-kube-api-access-92pff\") pod \"etcd-operator-b45778765-k7tn2\" (UID: \"f08aef24-f00f-43da-8ac1-79def39914ce\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.131306 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.132151 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4cbh\" (UniqueName: \"kubernetes.io/projected/ae863415-6074-4ce2-9e25-8c0705ed1e80-kube-api-access-j4cbh\") pod \"apiserver-76f77b778f-x25bq\" (UID: \"ae863415-6074-4ce2-9e25-8c0705ed1e80\") " pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.132647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/79d432ec-ac07-4516-a0a0-38fc02ec3e80-metrics-tls\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.132933 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-stats-auth\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.133884 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e89c9638-4420-465f-b9f4-0afe798f1610-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.134308 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2352b624-13d5-49ce-ac83-0a72f19879af-webhook-certs\") 
pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.134958 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsp68\" (UniqueName: \"kubernetes.io/projected/a75359a0-583e-4732-a043-4088c2ca0910-kube-api-access-qsp68\") pod \"migrator-59844c95c7-nltr4\" (UID: \"a75359a0-583e-4732-a043-4088c2ca0910\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.131136 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv6ll\" (UniqueName: \"kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.135234 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume\") pod \"collect-profiles-29414355-rpgmw\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.136764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z9dx\" (UniqueName: \"kubernetes.io/projected/f7de5a66-87ae-4f5f-8f21-f9f6bff749da-kube-api-access-7z9dx\") pod \"service-ca-operator-777779d784-z4wmg\" (UID: \"f7de5a66-87ae-4f5f-8f21-f9f6bff749da\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.137375 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdjw9\" (UniqueName: \"kubernetes.io/projected/e89c9638-4420-465f-b9f4-0afe798f1610-kube-api-access-zdjw9\") pod \"package-server-manager-789f6589d5-g7j5k\" (UID: \"e89c9638-4420-465f-b9f4-0afe798f1610\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.140088 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2lw\" (UniqueName: \"kubernetes.io/projected/65156769-02c6-4cb1-a9ff-c51c8b458135-kube-api-access-6s2lw\") pod \"csi-hostpathplugin-4hnbc\" (UID: \"65156769-02c6-4cb1-a9ff-c51c8b458135\") " pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.140109 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z7qq\" (UniqueName: \"kubernetes.io/projected/ac97c016-fcdc-4499-b4d4-6e5478c1de36-kube-api-access-8z7qq\") pod \"service-ca-9c57cc56f-nvsfq\" (UID: \"ac97c016-fcdc-4499-b4d4-6e5478c1de36\") " pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.141035 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7817860b-74ba-4dec-b243-6f3571884745-cert\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.142125 4676 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.143664 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzdj\" (UniqueName: \"kubernetes.io/projected/86926fca-c917-498b-a3f3-7315ec1e5370-kube-api-access-5fzdj\") pod \"olm-operator-6b444d44fb-ls5xb\" (UID: \"86926fca-c917-498b-a3f3-7315ec1e5370\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.143759 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.145519 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-node-bootstrap-token\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.154204 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.155317 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d7d3cfa5-43a3-4257-9461-2fd207b53800-certs\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.158016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cfa82d87-b071-46fc-af14-295ff38871aa-proxy-tls\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.158260 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b6db772-e434-4619-b2e3-bacb9b4c527a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.158540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkcj\" (UniqueName: \"kubernetes.io/projected/6bbe37e1-bbb1-4298-8427-f8c233470593-kube-api-access-tnkcj\") pod \"kube-storage-version-migrator-operator-b67b599dd-2gzzj\" (UID: \"6bbe37e1-bbb1-4298-8427-f8c233470593\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.159192 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jtnb\" (UniqueName: \"kubernetes.io/projected/79d432ec-ac07-4516-a0a0-38fc02ec3e80-kube-api-access-9jtnb\") pod \"dns-default-wk9bw\" (UID: \"79d432ec-ac07-4516-a0a0-38fc02ec3e80\") " pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.159646 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhzk\" (UniqueName: \"kubernetes.io/projected/29205e6d-74be-4a99-b92d-50152cb21845-kube-api-access-6fhzk\") pod \"control-plane-machine-set-operator-78cbb6b69f-5hd4h\" (UID: \"29205e6d-74be-4a99-b92d-50152cb21845\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.159625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.163351 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjpvz\" (UniqueName: \"kubernetes.io/projected/d7d3cfa5-43a3-4257-9461-2fd207b53800-kube-api-access-kjpvz\") pod \"machine-config-server-b6qzl\" (UID: \"d7d3cfa5-43a3-4257-9461-2fd207b53800\") " pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.179416 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.180106 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.680085496 +0000 UTC m=+159.114755353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.240508 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.253071 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.271375 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.280899 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.281183 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.781159725 +0000 UTC m=+159.215829582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.281460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.281987 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.781977368 +0000 UTC m=+159.216647225 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.282132 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.366459 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.367461 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-b6qzl" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.368974 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.369442 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.371410 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.378331 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.383878 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.384463 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:31.884445725 +0000 UTC m=+159.319115582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.386721 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.503584 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.504094 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.004075063 +0000 UTC m=+159.438744920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.509176 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b6db772-e434-4619-b2e3-bacb9b4c527a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-v8rmq\" (UID: \"9b6db772-e434-4619-b2e3-bacb9b4c527a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.516695 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j69qn\" (UniqueName: \"kubernetes.io/projected/54fb0764-8ac7-48d5-87ce-e2c15115ae6a-kube-api-access-j69qn\") pod \"openshift-controller-manager-operator-756b6f6bc6-jmxc7\" (UID: \"54fb0764-8ac7-48d5-87ce-e2c15115ae6a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.523437 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsjqk\" (UniqueName: \"kubernetes.io/projected/2352b624-13d5-49ce-ac83-0a72f19879af-kube-api-access-tsjqk\") pod \"multus-admission-controller-857f4d67dd-rbngc\" (UID: \"2352b624-13d5-49ce-ac83-0a72f19879af\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.523803 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmlns\" (UniqueName: \"kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns\") pod \"marketplace-operator-79b997595-4627g\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.525842 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.526206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.526420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cl8\" (UniqueName: \"kubernetes.io/projected/cfa82d87-b071-46fc-af14-295ff38871aa-kube-api-access-g6cl8\") pod \"machine-config-controller-84d6567774-vqkqz\" (UID: \"cfa82d87-b071-46fc-af14-295ff38871aa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.529067 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.529559 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg22\" (UniqueName: \"kubernetes.io/projected/7817860b-74ba-4dec-b243-6f3571884745-kube-api-access-dhg22\") pod \"ingress-canary-5j6kp\" (UID: \"7817860b-74ba-4dec-b243-6f3571884745\") " pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.529629 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-cts56"] Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.534066 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx2rl\" (UniqueName: \"kubernetes.io/projected/6f91c5fa-e347-44f5-8229-cdaa1db9b7a0-kube-api-access-zx2rl\") pod \"router-default-5444994796-nrpqk\" (UID: \"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0\") " pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.535753 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn"] Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.543229 4676 generic.go:334] "Generic (PLEG): container finished" podID="fdf10486-0860-4dad-984e-d82daaac8ecd" containerID="e64c4cfec9d5ee51ca7fa04a582801b6a20b7e35e0e510875d82eb10a1d7eb02" exitCode=0 Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.543647 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" event={"ID":"fdf10486-0860-4dad-984e-d82daaac8ecd","Type":"ContainerDied","Data":"e64c4cfec9d5ee51ca7fa04a582801b6a20b7e35e0e510875d82eb10a1d7eb02"} Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.549105 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrp8k\" (UniqueName: \"kubernetes.io/projected/d5d16762-1e73-4856-9593-ae335bce123b-kube-api-access-hrp8k\") pod \"dns-operator-744455d44c-4h6zp\" (UID: \"d5d16762-1e73-4856-9593-ae335bce123b\") " pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.555696 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2"] Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.555736 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" event={"ID":"591b399c-21b2-4c6f-ab3a-d424df670c0b","Type":"ContainerStarted","Data":"476c1d841b71b355a86d80c332a53ac94962c9f3bf87315f6d51bc4ed6f0dca2"} Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.555762 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.564707 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" event={"ID":"a735889f-51fc-49e1-8756-4f9dc2c05d94","Type":"ContainerStarted","Data":"50493e69647a3cf0ad71e43163442a5f7155134f6e94b53bee84bced380052c7"} Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.568225 4676 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-dlhc6 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.568364 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.590651 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5j6kp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.605740 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.606890 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.106869519 +0000 UTC m=+159.541539376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.742652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.743092 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.24305467 +0000 UTC m=+159.677724517 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.763571 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.771641 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.782374 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.814867 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.843891 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.844208 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.344189251 +0000 UTC m=+159.778859108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:31 crc kubenswrapper[4676]: I1204 15:22:31.945617 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:31 crc kubenswrapper[4676]: E1204 15:22:31.946215 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.446194596 +0000 UTC m=+159.880864443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.084590 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.085806 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.585750759 +0000 UTC m=+160.020420616 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.187118 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.187871 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.687836416 +0000 UTC m=+160.122506323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.338788 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.351095 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.850777969 +0000 UTC m=+160.285447826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.449054 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.450218 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:32.950202983 +0000 UTC m=+160.384872830 (durationBeforeRetry 500ms). 
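Each failed volume operation above ends with No retries permitted until <timestamp> (durationBeforeRetry 500ms): the operation executor records a per-operation earliest-retry time, and subsequent reconciler passes skip the operation until that gate expires, which is why the identical error reappears roughly every half second. A toy model of the gate (invented types; not nestedpendingoperations itself):

package main

import (
	"fmt"
	"time"
)

// pendingOp holds the earliest instant at which a failed operation may be
// retried, mirroring the "No retries permitted until ..." gate in the log.
type pendingOp struct {
	notBefore time.Time
}

func (p *pendingOp) tryRun(now time.Time, run func() error) {
	if now.Before(p.notBefore) {
		return // inside the backoff window: this reconcile pass skips the op
	}
	if err := run(); err != nil {
		const durationBeforeRetry = 500 * time.Millisecond
		p.notBefore = now.Add(durationBeforeRetry)
		fmt.Printf("failed. No retries permitted until %s (durationBeforeRetry %s). Error: %v\n",
			p.notBefore.UTC().Format(time.RFC3339Nano), durationBeforeRetry, err)
	}
}

func main() {
	op := &pendingOp{}
	mountDevice := func() error {
		return fmt.Errorf("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")
	}
	start := time.Now()
	// Three passes 300ms apart: the second falls inside the 500ms window and
	// is skipped; the third retries and fails again, re-arming the gate.
	for pass := 0; pass < 3; pass++ {
		op.tryRun(start.Add(time.Duration(pass)*300*time.Millisecond), mountDevice)
	}
}
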
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.551156 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.552575 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.052552817 +0000 UTC m=+160.487222674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.595777 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-b6qzl" event={"ID":"d7d3cfa5-43a3-4257-9461-2fd207b53800","Type":"ContainerStarted","Data":"ccf64c3bbf57083923ca027255c7e4c4301fb40683ce2daa2e35ab5e3e95fd47"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.601009 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" event={"ID":"685f9e11-cab9-4f06-bcfe-9931c77f4d23","Type":"ContainerStarted","Data":"a0ff0fc5892bf336f6fa17d26469fc05e17fa41eece5c6903d23295ef04a8a68"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.603494 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" event={"ID":"76f9c064-9769-41c0-8936-340f895bc36d","Type":"ContainerStarted","Data":"c20ba3f8b07ce834e4d4a2b67b36c6caeb5f9b526b7befc5119c5a2a5298d979"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.608721 4676 generic.go:334] "Generic (PLEG): container finished" podID="d3e5dc91-43ef-4a63-9898-504dfd9b4398" containerID="afc2c3f86b20e617f8db45a021bf2346d99f0c5fa76fae7931e1dce511f312c9" exitCode=0 Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.608820 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" event={"ID":"d3e5dc91-43ef-4a63-9898-504dfd9b4398","Type":"ContainerDied","Data":"afc2c3f86b20e617f8db45a021bf2346d99f0c5fa76fae7931e1dce511f312c9"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.637371 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" 
event={"ID":"e9bbf7af-9cc9-4dec-a933-dff6683aa16a","Type":"ContainerStarted","Data":"ccfba15af3433234b4944207c089a8fc83f0f3749edcd749e6258d68f9115e4f"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.644610 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" podStartSLOduration=134.644584649 podStartE2EDuration="2m14.644584649s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:32.145874476 +0000 UTC m=+159.580544353" watchObservedRunningTime="2025-12-04 15:22:32.644584649 +0000 UTC m=+160.079254506" Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.645742 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" event={"ID":"7bdebf26-30a2-44be-88b4-24d230d01708","Type":"ContainerStarted","Data":"fbce5b7f16c365d41ea362ad27e892d4656699542504ef389a6f74d88fc0d4e4"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.653604 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.654292 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.154268084 +0000 UTC m=+160.588937991 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.654839 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" event={"ID":"662295c5-dfd2-4536-bd74-8d5624100ea5","Type":"ContainerStarted","Data":"161f88c8571367655fdc309ebfa222769cfa0a2b30a267ad2edf2b4c689dca19"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.662345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" event={"ID":"a08b22ef-20e1-4a1c-bec4-e35311bf926b","Type":"ContainerStarted","Data":"8b5871be4d4e17dd3e2ddbf5fb3da9301e90cc44bfddb10d4fb85d43bd1ffbbe"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.668931 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" event={"ID":"3162c38f-2d77-4c34-a890-a8f321e1eebc","Type":"ContainerStarted","Data":"0e4026e99d11572d098182be52ae41c93ed217ebabea69e226635e6ba7137f7c"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.675743 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" event={"ID":"b92bcd16-c0e2-4cb6-8a6b-63aa9d09e290","Type":"ContainerStarted","Data":"2ea2e206e23820157f5939f849904d7caf86d2cc4e65951c48b914ab77636721"} Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.684568 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.712984 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-8t6kz" podStartSLOduration=134.712956842 podStartE2EDuration="2m14.712956842s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:32.712732506 +0000 UTC m=+160.147402363" watchObservedRunningTime="2025-12-04 15:22:32.712956842 +0000 UTC m=+160.147626719" Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.714036 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jwhjf" podStartSLOduration=134.714024681 podStartE2EDuration="2m14.714024681s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:32.682565189 +0000 UTC m=+160.117235046" watchObservedRunningTime="2025-12-04 15:22:32.714024681 +0000 UTC m=+160.148694538" Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.733446 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-fz52v" podStartSLOduration=134.733420883 
podStartE2EDuration="2m14.733420883s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:32.732004924 +0000 UTC m=+160.166674811" watchObservedRunningTime="2025-12-04 15:22:32.733420883 +0000 UTC m=+160.168090740" Dec 04 15:22:32 crc kubenswrapper[4676]: I1204 15:22:32.919377 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:32 crc kubenswrapper[4676]: E1204 15:22:32.925019 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.4249497 +0000 UTC m=+160.859619557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.021411 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.031677 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.531655133 +0000 UTC m=+160.966324990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.127695 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.128036 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 15:22:33.628000743 +0000 UTC m=+161.062670600 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.128232 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.128701 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.628692432 +0000 UTC m=+161.063362289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.230757 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.231056 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.731009575 +0000 UTC m=+161.165679432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.231248 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.231893 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.731862419 +0000 UTC m=+161.166532276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.332411 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.332753 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.832701561 +0000 UTC m=+161.267371428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.433765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.434210 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:33.934195812 +0000 UTC m=+161.368865669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.618110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.618515 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.118460221 +0000 UTC m=+161.553130088 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.618865 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.626657 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.126612794 +0000 UTC m=+161.561282651 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.727799 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.728748 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.228724421 +0000 UTC m=+161.663394288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.765246 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.766274 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" event={"ID":"a735889f-51fc-49e1-8756-4f9dc2c05d94","Type":"ContainerStarted","Data":"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076"} Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.766799 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.778863 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nrpqk" event={"ID":"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0","Type":"ContainerStarted","Data":"20cbebadd063563d398f21ae1f5faa433668c827098325cee380a017cf800e44"} Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.822568 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" podStartSLOduration=134.822518501 podStartE2EDuration="2m14.822518501s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:33.820234609 +0000 UTC m=+161.254904486" watchObservedRunningTime="2025-12-04 15:22:33.822518501 +0000 UTC m=+161.257188358" Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.838779 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.841618 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.341598884 +0000 UTC m=+161.776268741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.940621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.940864 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.440840553 +0000 UTC m=+161.875510410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:33 crc kubenswrapper[4676]: I1204 15:22:33.941102 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:33 crc kubenswrapper[4676]: E1204 15:22:33.942353 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.442337534 +0000 UTC m=+161.877007431 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.042710 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.042993 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.542956051 +0000 UTC m=+161.977625908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.043167 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.043769 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.543751592 +0000 UTC m=+161.978421509 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.144858 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.145843 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.146002 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.645979413 +0000 UTC m=+162.080649280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.146210 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.146533 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.646522118 +0000 UTC m=+162.081191975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.252765 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.253173 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.75315003 +0000 UTC m=+162.187819887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.354164 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.354713 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.854695052 +0000 UTC m=+162.289364909 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.404637 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.412984 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qlskj"] Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.425712 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk"] Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.434849 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7"] Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.458514 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.459542 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:34.959521294 +0000 UTC m=+162.394191151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.485007 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-qbw9s"] Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.567040 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.567545 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.067528013 +0000 UTC m=+162.502197870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.674157 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.674770 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.17474798 +0000 UTC m=+162.609417837 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.779407 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.780056 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.280037465 +0000 UTC m=+162.714707322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.821318 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" event={"ID":"e9bbf7af-9cc9-4dec-a933-dff6683aa16a","Type":"ContainerStarted","Data":"b08b7720089852018d2ad51f8dd972aed1ba08fcbc293534e4c8c8b5411876cb"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.834272 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" event={"ID":"eb280ecc-1666-4a9a-a2b3-910b09de7474","Type":"ContainerStarted","Data":"8c036eb579edec75e3e621cadc4f164904b45b999e98a73713419b9b1bc718d8"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.893477 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mtj84" event={"ID":"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362","Type":"ContainerStarted","Data":"bf8fecebf4d575dfd03e57f6a6aa298c07db2809d2bd4ae54a626d6bae980cd7"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.895489 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rqcz2" podStartSLOduration=136.895463947 podStartE2EDuration="2m16.895463947s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:34.894718417 +0000 UTC m=+162.329388294" watchObservedRunningTime="2025-12-04 15:22:34.895463947 +0000 UTC m=+162.330133804" Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.895940 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:34 crc kubenswrapper[4676]: E1204 15:22:34.896941 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.396889926 +0000 UTC m=+162.831559783 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.910122 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" event={"ID":"685f9e11-cab9-4f06-bcfe-9931c77f4d23","Type":"ContainerStarted","Data":"bc2bacca59841bb621eb83fbe6781e9f4d02149862e82f05a83740253da10509"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.952048 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-f7kvn" podStartSLOduration=135.952018597 podStartE2EDuration="2m15.952018597s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:34.941817707 +0000 UTC m=+162.376487564" watchObservedRunningTime="2025-12-04 15:22:34.952018597 +0000 UTC m=+162.386688464" Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.952387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" event={"ID":"662295c5-dfd2-4536-bd74-8d5624100ea5","Type":"ContainerStarted","Data":"2a76f5cb6932b6a3ed80620625f84186f8e6a5e990c1b001e9f85b35c97a706f"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.952467 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" event={"ID":"662295c5-dfd2-4536-bd74-8d5624100ea5","Type":"ContainerStarted","Data":"003a079c2d804e8ca03db5ed6201a15747382659fa3019749e86168fe8df316e"} Dec 04 15:22:34 crc kubenswrapper[4676]: I1204 15:22:34.952497 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.002082 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.002749 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-nrpqk" event={"ID":"6f91c5fa-e347-44f5-8229-cdaa1db9b7a0","Type":"ContainerStarted","Data":"a51320a81dbc37ae75a14c95c228119f8469ec8ad3fc8fcdd9cd10cb4cfd4c99"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.003628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.004966 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.504951107 +0000 UTC m=+162.939620964 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.024264 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-b6qzl" event={"ID":"d7d3cfa5-43a3-4257-9461-2fd207b53800","Type":"ContainerStarted","Data":"41e936f5a7d8dc0fc651f6335a8fc70ee70d0422556dab9d019a8759f9e3e9cc"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.026402 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wk9bw"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.055064 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" event={"ID":"fdf10486-0860-4dad-984e-d82daaac8ecd","Type":"ContainerStarted","Data":"8644759a635ef2338f2aca9ba39ad50d8d280db29af93c3c19607a016072b120"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.056134 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.057453 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-cts56" podStartSLOduration=136.057425515 podStartE2EDuration="2m16.057425515s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.056640343 +0000 UTC m=+162.491310210" watchObservedRunningTime="2025-12-04 15:22:35.057425515 +0000 UTC m=+162.492095362" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.077101 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" event={"ID":"76f9c064-9769-41c0-8936-340f895bc36d","Type":"ContainerStarted","Data":"34030f2499bf20f448312022ee8a026d05ac32cca3b936435fc871880ccea346"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.077634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.109693 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" podStartSLOduration=137.109669396 podStartE2EDuration="2m17.109669396s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.098290014 +0000 UTC m=+162.532959881" watchObservedRunningTime="2025-12-04 15:22:35.109669396 +0000 UTC m=+162.544339253" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.109794 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-58897d9998-qlskj" event={"ID":"64c8acea-9343-42d1-84cc-168d575e30a5","Type":"ContainerStarted","Data":"9112411fa1f725ff2e7cd9fa43cda97be7497e8a147c96f290173cddd46f48be"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.109856 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qlskj" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.110236 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.111880 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.611851866 +0000 UTC m=+163.046521723 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.131608 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" event={"ID":"559634f6-983d-4ae2-959e-8b54abc1326d","Type":"ContainerStarted","Data":"d3f0301264ee599f0add59b4d96b1559a7412bb0ea022a8d146de003258b2de9"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.132497 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.132665 4676 patch_prober.go:28] interesting pod/console-operator-58897d9998-qlskj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.132758 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qlskj" podUID="64c8acea-9343-42d1-84cc-168d575e30a5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.139370 4676 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k55v7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" start-of-body= Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.139430 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" podUID="559634f6-983d-4ae2-959e-8b54abc1326d" 
containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.17:5443/healthz\": dial tcp 10.217.0.17:5443: connect: connection refused" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.170005 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.180428 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-b6qzl" podStartSLOduration=8.180390384 podStartE2EDuration="8.180390384s" podCreationTimestamp="2025-12-04 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.150307229 +0000 UTC m=+162.584977076" watchObservedRunningTime="2025-12-04 15:22:35.180390384 +0000 UTC m=+162.615060241" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.183708 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.196364 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k7tn2"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.200500 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4h6zp"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.208128 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.218128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.220052 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.72003148 +0000 UTC m=+163.154701337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.228417 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-nrpqk" podStartSLOduration=136.228385859 podStartE2EDuration="2m16.228385859s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.217689656 +0000 UTC m=+162.652359513" watchObservedRunningTime="2025-12-04 15:22:35.228385859 +0000 UTC m=+162.663055706" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.230280 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.245222 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.246936 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" event={"ID":"d3e5dc91-43ef-4a63-9898-504dfd9b4398","Type":"ContainerStarted","Data":"2cace520daf86056c6dcca3c925ecd8b80134016e4619791a58a886539f222f1"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.246998 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.266119 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x25bq"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.290556 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.341970 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.342612 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.842589508 +0000 UTC m=+163.277259365 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.346521 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nvsfq"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.346582 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" event={"ID":"7bdebf26-30a2-44be-88b4-24d230d01708","Type":"ContainerStarted","Data":"12826deb2ff079a851140165d39ddc1ab3952ee28b34995e5851010da7d353c9"} Dec 04 15:22:35 crc kubenswrapper[4676]: W1204 15:22:35.383336 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86926fca_c917_498b_a3f3_7315ec1e5370.slice/crio-cfcea0e6bf842e166137f6663247f2a2c359eb8f3c838ad03deead4f5892b3cd WatchSource:0}: Error finding container cfcea0e6bf842e166137f6663247f2a2c359eb8f3c838ad03deead4f5892b3cd: Status 404 returned error can't find the container with id cfcea0e6bf842e166137f6663247f2a2c359eb8f3c838ad03deead4f5892b3cd Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.387799 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" podStartSLOduration=136.387767055 podStartE2EDuration="2m16.387767055s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.260963781 +0000 UTC m=+162.695633648" watchObservedRunningTime="2025-12-04 15:22:35.387767055 +0000 UTC m=+162.822436912" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.392608 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qlskj" podStartSLOduration=137.392597078 podStartE2EDuration="2m17.392597078s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.289415321 +0000 UTC m=+162.724085188" watchObservedRunningTime="2025-12-04 15:22:35.392597078 +0000 UTC m=+162.827266935" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.463535 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.471194 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:35.971174 +0000 UTC m=+163.405843857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514772 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qbw9s" event={"ID":"1348ed48-644b-49f3-b674-92cd4e39d1ec","Type":"ContainerStarted","Data":"ef7f82a5f686cac8e450b477a4f799c3f4435fd4fabcf28a35b8fde5e640e4ab"} Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514871 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514915 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rbngc"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514936 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514951 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514978 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.514995 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5j6kp"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.532004 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-8k7hs" podStartSLOduration=136.531967116 podStartE2EDuration="2m16.531967116s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.394589672 +0000 UTC m=+162.829259529" watchObservedRunningTime="2025-12-04 15:22:35.531967116 +0000 UTC m=+162.966636983" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.533199 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-4hnbc"] Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.566685 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.570490 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.07045541 +0000 UTC m=+163.505125267 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.574843 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct" podStartSLOduration=136.57480954 podStartE2EDuration="2m16.57480954s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.51496594 +0000 UTC m=+162.949635817" watchObservedRunningTime="2025-12-04 15:22:35.57480954 +0000 UTC m=+163.009479397" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.576179 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-kh68m" podStartSLOduration=137.576171287 podStartE2EDuration="2m17.576171287s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.5605875 +0000 UTC m=+162.995257377" watchObservedRunningTime="2025-12-04 15:22:35.576171287 +0000 UTC m=+163.010841144" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.657932 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-qbw9s" podStartSLOduration=137.657882236 podStartE2EDuration="2m17.657882236s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:35.641475766 +0000 UTC m=+163.076145633" watchObservedRunningTime="2025-12-04 15:22:35.657882236 +0000 UTC m=+163.092552093" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.669210 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.670090 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.17007499 +0000 UTC m=+163.604744847 (durationBeforeRetry 500ms). 
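The "Observed pod startup duration" entries are plain arithmetic: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and the "0001-01-01" pull timestamps simply mean no image pull was observed for the container. Reproducing the oauth-apiserver number from the timestamps logged above:

// slo.go: recompute podStartSLOduration from the two timestamps in the log entry.
package main

import (
	"fmt"
	"log"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-04 15:20:19 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	observed, err := time.Parse(layout, "2025-12-04 15:22:35.57480954 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	// Prints 2m16.57480954s, matching the logged podStartSLOduration=136.57480954.
	fmt.Println(observed.Sub(created))
}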
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.770772 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.770934 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.270885912 +0000 UTC m=+163.705555769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.771452 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.771928 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.271893669 +0000 UTC m=+163.706563526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.816082 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.831325 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:35 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:35 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:35 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.831413 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.872656 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.873053 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.37300925 +0000 UTC m=+163.807679107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:35 crc kubenswrapper[4676]: I1204 15:22:35.975130 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:35 crc kubenswrapper[4676]: E1204 15:22:35.975495 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.475477286 +0000 UTC m=+163.910147143 (durationBeforeRetry 500ms). 
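The router's startup probe output above is the standard aggregated healthz format: one [+] or [-] line per subcheck, and an overall HTTP 500 while any subcheck fails (here backend-http and has-synced, with reasons withheld from unauthenticated probes). A sketch of a handler that produces that shape; the subcheck names come from the log, the port and everything else are illustrative:

// healthz.go: serve an aggregated health endpoint in the [+]/[-] style the
// kubelet prober is quoting.
package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	checks := []struct {
		name string
		err  error
	}{
		// Failing and passing subchecks modeled on the probe output above.
		{"backend-http", fmt.Errorf("reason withheld")},
		{"has-synced", fmt.Errorf("reason withheld")},
		{"process-running", nil},
	}
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		body, failed := "", false
		for _, c := range checks {
			if c.err != nil {
				failed = true
				body += fmt.Sprintf("[-]%s failed: %v\n", c.name, c.err)
			} else {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			}
		}
		if failed {
			w.WriteHeader(http.StatusInternalServerError) // the probe sees "statuscode: 500"
			body += "healthz check failed\n"
		}
		fmt.Fprint(w, body)
	})
	log.Fatal(http.ListenAndServe(":1936", nil))
}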
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.076749 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.077760 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.577736398 +0000 UTC m=+164.012406255 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.179465 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.180408 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.680361189 +0000 UTC m=+164.115031046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.283474 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.283891 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.783850055 +0000 UTC m=+164.218519912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.284138 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.284581 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.784567514 +0000 UTC m=+164.219237371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.386225 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.386554 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.886536988 +0000 UTC m=+164.321206845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.429694 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" event={"ID":"daa64ebc-2612-4a0c-833e-be450fbbd5d0","Type":"ContainerStarted","Data":"0ddeb78d4851d219366fb49cdba1856aa5738f1f12cf4021ec533e98fb2cb108"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.483196 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" event={"ID":"ac97c016-fcdc-4499-b4d4-6e5478c1de36","Type":"ContainerStarted","Data":"c1a0b5e026dec11b026c8f8bf1086e083c8b1d083a35d53dedff3e7abc09c359"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.487754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.488275 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:36.988255285 +0000 UTC m=+164.422925142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.492300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" event={"ID":"f7de5a66-87ae-4f5f-8f21-f9f6bff749da","Type":"ContainerStarted","Data":"9b8a54d5ab5de30108343d90e67c33a5f632a8c547b7c85268a6aacf211ab89e"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.503607 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" event={"ID":"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d","Type":"ContainerStarted","Data":"4b0687231ef46f1df1ea3301976b5482d48f8ffea2f118b12df9738514bf5a3a"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.503664 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" event={"ID":"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d","Type":"ContainerStarted","Data":"31c81bf182410af48f2ab29fd61cf1d7bde863858809722c8014d2d137706828"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.504509 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.510184 4676 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4627g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.510253 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.513636 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" event={"ID":"eb280ecc-1666-4a9a-a2b3-910b09de7474","Type":"ContainerStarted","Data":"e225c190460497d2831e21e849cfde78c91a14859110d4d495526c01e20608d8"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.513692 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" event={"ID":"eb280ecc-1666-4a9a-a2b3-910b09de7474","Type":"ContainerStarted","Data":"19bf35a66c90549452626291c7a0e0f2359b1f7582999ddbc291ab50d932ec5a"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.519668 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" event={"ID":"ed5477e6-0f8c-457f-a314-6a8263aa89ac","Type":"ContainerStarted","Data":"0097ae1ff6daa6c83a12cb3243102609df1fef98b68a6de3c9d49761697021c2"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.545726 4676 kubelet.go:2453] "SyncLoop (PLEG): event for 
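"connect: connection refused" from a readiness probe immediately after a ContainerStarted event usually just means the server has not bound its port yet, not that anything crashed: the kubelet issues a plain HTTP GET with a short timeout and treats transport errors, or status codes outside 200-399, as failures. A sketch of that check against the marketplace-operator endpoint quoted in the log:

// probe.go: a minimal imitation of the kubelet's HTTP readiness probe.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get("http://10.217.0.37:8080/healthz")
	if err != nil {
		// e.g. "dial tcp 10.217.0.37:8080: connect: connection refused"
		fmt.Println("Probe failed:", err)
		return
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		fmt.Println("Probe succeeded:", resp.Status)
	} else {
		fmt.Println("Probe failed with status:", resp.Status)
	}
}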
pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" event={"ID":"e89c9638-4420-465f-b9f4-0afe798f1610","Type":"ContainerStarted","Data":"82afcd50c8de0aa05fa5365bc5f97bba33bf0ff1370aaadfa38a3c44a3f7984c"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.545782 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" event={"ID":"e89c9638-4420-465f-b9f4-0afe798f1610","Type":"ContainerStarted","Data":"d059dd8220a5123b95d704d8a6f44bfe427ab3f5a8f42cc3846dd53ae9b41338"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.557730 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" podStartSLOduration=137.557705138 podStartE2EDuration="2m17.557705138s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.556733681 +0000 UTC m=+163.991403538" watchObservedRunningTime="2025-12-04 15:22:36.557705138 +0000 UTC m=+163.992374995" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.591108 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.593273 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.093245132 +0000 UTC m=+164.527914989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.680239 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jc9zk" podStartSLOduration=138.680214514 podStartE2EDuration="2m18.680214514s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.67785639 +0000 UTC m=+164.112526257" watchObservedRunningTime="2025-12-04 15:22:36.680214514 +0000 UTC m=+164.114884371" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.695447 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.697797 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.197779835 +0000 UTC m=+164.632449692 (durationBeforeRetry 500ms). 
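With the same two volume errors recurring every half second, measuring beats reading: counting the occurrences and noting the first and last timestamps shows exactly how long the driver stayed unregistered. A sketch that scans an exported journal; the kubelet.log filename assumes a capture along the lines of journalctl -u kubelet > kubelet.log:

// count.go: count the driver-not-found errors in an exported journal and
// report the first and last timestamps seen.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	f, err := os.Open("kubelet.log") // assumed journal export
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer f.Close()
	const needle = "not found in the list of registered CSI drivers"
	var n int
	var first, last string
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, needle) {
			continue
		}
		n++
		stamp := line
		if len(stamp) > 15 {
			stamp = stamp[:15] // the "Dec 04 15:22:35" prefix
		}
		if first == "" {
			first = stamp
		}
		last = stamp
	}
	fmt.Printf("%d occurrences, first %q, last %q\n", n, first, last)
}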
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.698148 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mtj84" event={"ID":"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362","Type":"ContainerStarted","Data":"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.731769 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-mtj84" podStartSLOduration=138.731746616 podStartE2EDuration="2m18.731746616s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.726868132 +0000 UTC m=+164.161537989" watchObservedRunningTime="2025-12-04 15:22:36.731746616 +0000 UTC m=+164.166416473" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.758231 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" event={"ID":"2352b624-13d5-49ce-ac83-0a72f19879af","Type":"ContainerStarted","Data":"dbdf0ac03a547eba8e7557ff077e82ff97764b8d850783339be00ae424505e75"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.782446 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qlskj" event={"ID":"64c8acea-9343-42d1-84cc-168d575e30a5","Type":"ContainerStarted","Data":"ee263bfe9998f089d7d97b24dc775c2653301c5b2a5b1c43c49740ea9a58f0b6"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.791197 4676 patch_prober.go:28] interesting pod/console-operator-58897d9998-qlskj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.791284 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qlskj" podUID="64c8acea-9343-42d1-84cc-168d575e30a5" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.797014 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.799555 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-04 15:22:37.299535813 +0000 UTC m=+164.734205660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.825162 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" event={"ID":"54fb0764-8ac7-48d5-87ce-e2c15115ae6a","Type":"ContainerStarted","Data":"e1bede2ebd0879170e27026be489760f84d5e6c649cbca13728e13f962723ef1"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.829066 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:36 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:36 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:36 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.829132 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.846783 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" event={"ID":"71b79282-23b9-4bfd-b5b9-446f82131905","Type":"ContainerStarted","Data":"ffcb8e0ebd9f099fc24c81c189eefdacb6f5c61bf18b3955f2a41a82e3d7a4af"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.855402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" event={"ID":"d35d3a3f-f614-45fa-a59a-e5cefa471321","Type":"ContainerStarted","Data":"ef5dfe9325db7a54c3641d571301202da5c6ddcc88855e02a3e6042b0e4ae03e"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.864391 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" event={"ID":"86926fca-c917-498b-a3f3-7315ec1e5370","Type":"ContainerStarted","Data":"cfcea0e6bf842e166137f6663247f2a2c359eb8f3c838ad03deead4f5892b3cd"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.867133 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.880011 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" podStartSLOduration=137.879983217 podStartE2EDuration="2m17.879983217s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.879096183 +0000 UTC m=+164.313766040" watchObservedRunningTime="2025-12-04 
15:22:36.879983217 +0000 UTC m=+164.314653074" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.880336 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" event={"ID":"9b6db772-e434-4619-b2e3-bacb9b4c527a","Type":"ContainerStarted","Data":"dc4e19665fd22a293bd55ebab636820467d57011040d001e76ac75b671fba303"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.884428 4676 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-ls5xb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.884522 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" podUID="86926fca-c917-498b-a3f3-7315ec1e5370" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.898645 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" event={"ID":"f08aef24-f00f-43da-8ac1-79def39914ce","Type":"ContainerStarted","Data":"7618f3c9456d71a1e6583f20d513fc308618055205f21055d527ceedae1d402c"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.900174 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:36 crc kubenswrapper[4676]: E1204 15:22:36.901525 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.401509597 +0000 UTC m=+164.836179454 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.927729 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-qbw9s" event={"ID":"1348ed48-644b-49f3-b674-92cd4e39d1ec","Type":"ContainerStarted","Data":"4acc4774133bb85aef35eeae716e08983b55cc1142b5dd45210d760dde8ee73a"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.928940 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.949952 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qbw9s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.950024 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qbw9s" podUID="1348ed48-644b-49f3-b674-92cd4e39d1ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.950347 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" event={"ID":"29205e6d-74be-4a99-b92d-50152cb21845","Type":"ContainerStarted","Data":"9923390f0faeac6172718ee649ba2047eefd2912d7ab95319c2c7f5bcb316019"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.961554 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" event={"ID":"65156769-02c6-4cb1-a9ff-c51c8b458135","Type":"ContainerStarted","Data":"f7353faa20714bf206a72e9fcc80d80166d5a042ed735b084ac75c4f8d3896ed"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.978722 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" podStartSLOduration=137.978702532 podStartE2EDuration="2m17.978702532s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.977263513 +0000 UTC m=+164.411933370" watchObservedRunningTime="2025-12-04 15:22:36.978702532 +0000 UTC m=+164.413372389" Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.979798 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" podStartSLOduration=137.979785062 podStartE2EDuration="2m17.979785062s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:36.92313368 +0000 UTC m=+164.357803557" watchObservedRunningTime="2025-12-04 15:22:36.979785062 +0000 UTC m=+164.414454919" Dec 04 15:22:36 
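The ContainerStarted events for hostpath-provisioner/csi-hostpathplugin-4hnbc are the other half of the story: once that pod's node-driver-registrar creates the registration socket, the queued mount and unmount operations should start succeeding on a subsequent 500ms retry. A sketch that waits for the socket to appear; the <driver-name>-reg.sock path follows the registrar's usual convention and is an assumption here:

// waitreg.go: poll for the hostpath driver's registration socket.
package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	// Assumed path: conventional registrar naming under the default kubelet root dir.
	const sock = "/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
	deadline := time.Now().Add(2 * time.Minute)
	for time.Now().Before(deadline) {
		if _, err := os.Stat(sock); err == nil {
			fmt.Println("driver registration socket is present:", sock)
			return
		}
		time.Sleep(500 * time.Millisecond) // match the volume manager's retry cadence
	}
	fmt.Println("gave up waiting for", sock)
}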
crc kubenswrapper[4676]: I1204 15:22:36.989238 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" event={"ID":"a75359a0-583e-4732-a043-4088c2ca0910","Type":"ContainerStarted","Data":"f8ad1828e451ce755c4744804b203ac9e04a017acd0dc003efff2a91eb743b0b"} Dec 04 15:22:36 crc kubenswrapper[4676]: I1204 15:22:36.989310 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" event={"ID":"a75359a0-583e-4732-a043-4088c2ca0910","Type":"ContainerStarted","Data":"59404fcbbec33ac01a035ee271a9264e5a348159efd7f63426f8879ab14e4330"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.008217 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.011380 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.511337086 +0000 UTC m=+164.946007033 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.038038 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" event={"ID":"ae863415-6074-4ce2-9e25-8c0705ed1e80","Type":"ContainerStarted","Data":"129e0960e85fbf4e48477a38a8284381b5888b0032bfe354c907fbd81a6e7743"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.038104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" event={"ID":"ae863415-6074-4ce2-9e25-8c0705ed1e80","Type":"ContainerStarted","Data":"19c3204d9a933ed8b84ddef524ad3a4465a2aa87068121ca1b2ad11f62722bed"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.068696 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" podStartSLOduration=138.068674767 podStartE2EDuration="2m18.068674767s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:37.011264634 +0000 UTC m=+164.445934491" watchObservedRunningTime="2025-12-04 15:22:37.068674767 +0000 UTC m=+164.503344624" Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.077685 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5j6kp" event={"ID":"7817860b-74ba-4dec-b243-6f3571884745","Type":"ContainerStarted","Data":"701a22876262a84608bab45d01a1f73b43ab6c59626c59f07bc7d100ff35547a"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.077739 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5j6kp" event={"ID":"7817860b-74ba-4dec-b243-6f3571884745","Type":"ContainerStarted","Data":"f9e65d9aadd7c2e9b89a3847c637512d2f075318c89fcf51c74066eaa7b31676"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.093524 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" event={"ID":"559634f6-983d-4ae2-959e-8b54abc1326d","Type":"ContainerStarted","Data":"609c63f7d9c57b8e1142a66507a18328ebe08f063b8917a0f04c30421ece2a1e"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.111289 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.111600 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.611586513 +0000 UTC m=+165.046256370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.112916 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" event={"ID":"d5d16762-1e73-4856-9593-ae335bce123b","Type":"ContainerStarted","Data":"6d7861492dbf23c20441632373c68fe6cfbfc9ab39119a0ca5285b99da55e701"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.122992 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wk9bw" event={"ID":"79d432ec-ac07-4516-a0a0-38fc02ec3e80","Type":"ContainerStarted","Data":"c22b9c893b77dfa8fcad87c8794e62e8057cb92f93d07b4d4ee6b6db62c8183c"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.123046 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wk9bw" event={"ID":"79d432ec-ac07-4516-a0a0-38fc02ec3e80","Type":"ContainerStarted","Data":"a78bd6a0c3b049cf62716102feeb3a109697b618913e49b38117323564afebd6"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.129165 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k55v7" Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.143167 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" event={"ID":"cfa82d87-b071-46fc-af14-295ff38871aa","Type":"ContainerStarted","Data":"578f836c7bae2e3c36a06a6acab401bd72d052bd090423e2c4e8f37e8a3400a2"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.143227 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" event={"ID":"cfa82d87-b071-46fc-af14-295ff38871aa","Type":"ContainerStarted","Data":"5a24f36fd3b22be87733e5bca05f26f1c3b7b213729c3c07eb9923641a56c152"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.157813 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5j6kp" podStartSLOduration=10.157794279 podStartE2EDuration="10.157794279s" podCreationTimestamp="2025-12-04 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:37.110106302 +0000 UTC m=+164.544776159" watchObservedRunningTime="2025-12-04 15:22:37.157794279 +0000 UTC m=+164.592464136" Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.193710 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" event={"ID":"6bbe37e1-bbb1-4298-8427-f8c233470593","Type":"ContainerStarted","Data":"aa97bb0dcd22217a87c59c0e665ec941ec193487f437eec422756a03b93c718c"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.194095 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" event={"ID":"6bbe37e1-bbb1-4298-8427-f8c233470593","Type":"ContainerStarted","Data":"4b678ad9ef0c5b08b763cda8e14ac8d4d5d37d8a75382b29726f2138a93d9651"} Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.195181 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nr6vs" Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.214120 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.214302 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.714277686 +0000 UTC m=+165.148947533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.214830 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.216524 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.716503777 +0000 UTC m=+165.151173694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.314804 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2gzzj" podStartSLOduration=138.31477941 podStartE2EDuration="2m18.31477941s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:37.313628158 +0000 UTC m=+164.748298035" watchObservedRunningTime="2025-12-04 15:22:37.31477941 +0000 UTC m=+164.749449257" Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.315946 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.316041 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.816018624 +0000 UTC m=+165.250688481 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.316564 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.333242 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.833215325 +0000 UTC m=+165.267885232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.419110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.420085 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:37.920058534 +0000 UTC m=+165.354728391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.521697 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.522224 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:38.022207143 +0000 UTC m=+165.456877000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.622538 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:37 crc kubenswrapper[4676]: E1204 15:22:37.622788 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:38.122755168 +0000 UTC m=+165.557425025 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
[identical MountVolume.MountDevice / UnmountVolume.TearDown retry pairs for the same volume elided: 15:22:37.623073/.623480, .724572/.724825, .724943/.725861]
Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.821544 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 15:22:37 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld
Dec 04 15:22:37 crc kubenswrapper[4676]: [+]process-running ok
Dec 04 15:22:37 crc kubenswrapper[4676]: healthz check failed
Dec 04 15:22:37 crc kubenswrapper[4676]: I1204 15:22:37.821659 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
[the same retry pairs continue at roughly 100 ms intervals, each rescheduled with durationBeforeRetry 500ms: 15:22:37.826173/.826425, .826641/.827004, .928006/.928195, .928381/.928746, 15:22:38.029333/.029546, .029959/.030283, .131926/.132191, .132646/.133266]
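[editor's note — the elided records above are a single failure loop, not forward progress: the kubelet cannot find a CSI plugin registered under the name kubevirt.io.hostpath-provisioner, so every UnmountVolume.TearDown for the terminated pod's PVC and every MountVolume.MountDevice for the incoming image-registry pod fails and is rescheduled with a 500 ms backoff. A minimal sketch of how one might tally these retries from a saved journal, assuming the log has been exported to a file named kubelet.log; the file name and script are illustrative, not part of the log:]

    #!/usr/bin/env python3
    # Tally failed CSI volume operations per second from an exported journal.
    import re
    from collections import Counter

    # Matches the kubelet timestamp on nestedpendingoperations error records.
    record = re.compile(r"(\d{2}:\d{2}:\d{2})\.\d+\s+\d+ nestedpendingoperations\.go")

    counts = Counter()
    with open("kubelet.log", encoding="utf-8") as log:  # assumed export path
        for line in log:
            if "not found in the list of registered CSI drivers" in line:
                m = record.search(line)
                if m:
                    counts[m.group(1)] += 1

    for second, n in sorted(counts.items()):
        print(f"{second}  {n} failed volume operations")

[run against a full export of this window, every second from 15:22:37 through 15:22:40 shows several failures, confirming a steady retry loop rather than an escalating one.]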
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.201505 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" event={"ID":"54fb0764-8ac7-48d5-87ce-e2c15115ae6a","Type":"ContainerStarted","Data":"8a0929661fea9a8834436ceb55f37eaa560cd7e8cbf398ee3ea6544b60e31047"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.203422 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" event={"ID":"ac97c016-fcdc-4499-b4d4-6e5478c1de36","Type":"ContainerStarted","Data":"9dbf9840623bed3e47740b0154eba64198fd61c2717bc19599efadd543129e80"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.204496 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p2mg9" event={"ID":"71b79282-23b9-4bfd-b5b9-446f82131905","Type":"ContainerStarted","Data":"89cc7753b1d708a8f3b3f11cc5a2123d5ac3c64b6247048a1e722ae33af663e5"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.205692 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" event={"ID":"ed5477e6-0f8c-457f-a314-6a8263aa89ac","Type":"ContainerStarted","Data":"e6db3459df572a9bd62c5145dbf05d4be8de02f8aecfbee83ba1c70c309c1f4e"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.205958 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.207990 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" event={"ID":"d35d3a3f-f614-45fa-a59a-e5cefa471321","Type":"ContainerStarted","Data":"b5c62d7e3b199afb0b2bcb3eccdd6ff6cdf5e89ca004876db6b9ed13fc69a4d0"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.208177 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.209651 4676 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-675c2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body=
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.209690 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.209741 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" event={"ID":"f08aef24-f00f-43da-8ac1-79def39914ce","Type":"ContainerStarted","Data":"9586c5056c9e6b3c89c9132c8187ca716561308a8e337e50a039a3711f1b06f5"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.211297 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb" event={"ID":"86926fca-c917-498b-a3f3-7315ec1e5370","Type":"ContainerStarted","Data":"572378bf6f14052e5617b0daef9b4877c506590c0082c0e09825997bfeab704a"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.212913 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" event={"ID":"9b6db772-e434-4619-b2e3-bacb9b4c527a","Type":"ContainerStarted","Data":"1d4b9910e5583c6cc24ed8bc874c269196eed0c98e3ab2cad35b05c0d044c41a"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.214087 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5hd4h" event={"ID":"29205e6d-74be-4a99-b92d-50152cb21845","Type":"ContainerStarted","Data":"c05a5a9f62d0f2f2afecfb5784275ef5716408d6fa328c71e0ab20facc0c1b46"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.218109 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-ls5xb"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.218300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" event={"ID":"f7de5a66-87ae-4f5f-8f21-f9f6bff749da","Type":"ContainerStarted","Data":"041c7ba9697dda1b1e2dc823807878ca1e03dd0fb4e501f50365b2f32bdd8d30"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.220629 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" event={"ID":"d5d16762-1e73-4856-9593-ae335bce123b","Type":"ContainerStarted","Data":"89498618d8ab4483f589ccbaba9ec5d721824e28a73eeb905eaa7e0d3c0c212d"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.220659 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" event={"ID":"d5d16762-1e73-4856-9593-ae335bce123b","Type":"ContainerStarted","Data":"c0ab3cdd6c06a1e32a327b6fa261540838347c27b59868df90e726af02c60777"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.222644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" event={"ID":"2352b624-13d5-49ce-ac83-0a72f19879af","Type":"ContainerStarted","Data":"3657dd71272b31a690ae0bc612d365104324f419493504660e900010282af024"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.222670 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" event={"ID":"2352b624-13d5-49ce-ac83-0a72f19879af","Type":"ContainerStarted","Data":"23974260306e3ec47a88bb14c017b5a649c667a9648c3b73545486f5b2e46b1a"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.225780 4676 generic.go:334] "Generic (PLEG): container finished" podID="ae863415-6074-4ce2-9e25-8c0705ed1e80" containerID="129e0960e85fbf4e48477a38a8284381b5888b0032bfe354c907fbd81a6e7743" exitCode=0
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.225848 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" event={"ID":"ae863415-6074-4ce2-9e25-8c0705ed1e80","Type":"ContainerDied","Data":"129e0960e85fbf4e48477a38a8284381b5888b0032bfe354c907fbd81a6e7743"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.225930 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" event={"ID":"ae863415-6074-4ce2-9e25-8c0705ed1e80","Type":"ContainerStarted","Data":"08877b5ef064181bc44f2169557eefa98bd37576c7ad036bdc4df82850ca2acf"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.227476 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.228183 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wk9bw" event={"ID":"79d432ec-ac07-4516-a0a0-38fc02ec3e80","Type":"ContainerStarted","Data":"70e2f8670c4ab846f303d7e00664168da4763c1bf0e13c1635973834e0790014"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.228868 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-wk9bw"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.230920 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nltr4" event={"ID":"a75359a0-583e-4732-a043-4088c2ca0910","Type":"ContainerStarted","Data":"3478f9771a224c97c7b2eebfda39b8a2c959f387a425f7699855566ebba634aa"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.233239 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" event={"ID":"cfa82d87-b071-46fc-af14-295ff38871aa","Type":"ContainerStarted","Data":"4a5b16e2bcb0f43b72cd210c01e67eb4766798ddd189b14ad88e7d93b3518d44"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.234334 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.234513 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:38.734484238 +0000 UTC m=+166.169154105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.234934 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.236261 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:38.736240666 +0000 UTC m=+166.170910523 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.239937 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" event={"ID":"daa64ebc-2612-4a0c-833e-be450fbbd5d0","Type":"ContainerStarted","Data":"68b7984aa978cfaa97649563029cb7f60581f4c8042338841f0cfde5163dad1a"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.243034 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" event={"ID":"65156769-02c6-4cb1-a9ff-c51c8b458135","Type":"ContainerStarted","Data":"92f0b3a6045cbd1b96ac2e6f5a530bdd8961f8264df0482ec2d4e8c11d577a5c"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.246157 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" event={"ID":"e89c9638-4420-465f-b9f4-0afe798f1610","Type":"ContainerStarted","Data":"b059a25270e2391db0c4ceefe0b8b451463e39ab59f57de8f29b0a10d9f0548d"}
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.249720 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jmxc7" podStartSLOduration=140.249701275 podStartE2EDuration="2m20.249701275s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.246071195 +0000 UTC m=+165.680741042" watchObservedRunningTime="2025-12-04 15:22:38.249701275 +0000 UTC m=+165.684371132"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.250933 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qbw9s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.251408 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qbw9s" podUID="1348ed48-644b-49f3-b674-92cd4e39d1ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.253384 4676 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4627g container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.253506 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.277280 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qlskj"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.339298 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.343967 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-v8rmq" podStartSLOduration=139.343946967 podStartE2EDuration="2m19.343946967s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.298170673 +0000 UTC m=+165.732840530" watchObservedRunningTime="2025-12-04 15:22:38.343946967 +0000 UTC m=+165.778616834"
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.367354 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:38.867326197 +0000 UTC m=+166.301996054 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
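[editor's note — the ContainerStarted record for hostpath-provisioner/csi-hostpathplugin-4hnbc at 15:22:38.243034 above is the missing piece: that pod carries the CSI driver the retry loop has been waiting for. Node plugins announce themselves by placing a registration socket in the kubelet's plugin-registration directory, /var/lib/kubelet/plugins_registry under the default kubelet root. A minimal sketch of how one might watch for that registration on the node; the polling script is illustrative and assumes the default root directory:]

    #!/usr/bin/env python3
    # Poll the kubelet plugin-registration directory until the hostpath
    # driver's socket appears (assumes the default kubelet --root-dir).
    import pathlib
    import time

    reg_dir = pathlib.Path("/var/lib/kubelet/plugins_registry")
    driver = "kubevirt.io.hostpath-provisioner"

    while True:
        sockets = sorted(p.name for p in reg_dir.glob("*.sock"))
        if any(driver in name for name in sockets):
            print("driver registered:", ", ".join(sockets))
            break
        print("not yet registered; sockets:", ", ".join(sockets) or "none")
        time.sleep(1)

[once the registration socket exists, the "not found in the list of registered CSI drivers" errors should stop on the next 500 ms retry.]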
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.390637 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rbngc" podStartSLOduration=139.390620125 podStartE2EDuration="2m19.390620125s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.344625275 +0000 UTC m=+165.779295142" watchObservedRunningTime="2025-12-04 15:22:38.390620125 +0000 UTC m=+165.825289982"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.391312 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.391349 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.424042 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" podStartSLOduration=140.424023381 podStartE2EDuration="2m20.424023381s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.421146282 +0000 UTC m=+165.855816149" watchObservedRunningTime="2025-12-04 15:22:38.424023381 +0000 UTC m=+165.858693238"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.427316 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.541067 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.545649 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.045618182 +0000 UTC m=+166.480288039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.566693 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-njwq9" podStartSLOduration=139.566655848 podStartE2EDuration="2m19.566655848s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.562108734 +0000 UTC m=+165.996778591" watchObservedRunningTime="2025-12-04 15:22:38.566655848 +0000 UTC m=+166.001325705"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.602549 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-vqkqz" podStartSLOduration=139.602531241 podStartE2EDuration="2m19.602531241s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.600201377 +0000 UTC m=+166.034871234" watchObservedRunningTime="2025-12-04 15:22:38.602531241 +0000 UTC m=+166.037201098"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.634726 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4h6zp" podStartSLOduration=140.634702213 podStartE2EDuration="2m20.634702213s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.632293777 +0000 UTC m=+166.066963654" watchObservedRunningTime="2025-12-04 15:22:38.634702213 +0000 UTC m=+166.069372070"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.643087 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.644514 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.144489461 +0000 UTC m=+166.579159328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
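[editor's note — the pod_startup_latency_tracker records above are internally consistent: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, and no pull time is subtracted here because firstStartedPulling is the zero time. A small worked check against the catalog-operator record, with the timestamp truncated to microseconds because Python's datetime carries no nanoseconds:]

    #!/usr/bin/env python3
    # Verify podStartSLOduration for catalog-operator-68c6474976-njwq9:
    # watchObservedRunningTime - podCreationTimestamp.
    from datetime import datetime, timezone

    created = datetime(2025, 12, 4, 15, 20, 19, tzinfo=timezone.utc)
    observed = datetime(2025, 12, 4, 15, 22, 38, 566655, tzinfo=timezone.utc)

    print((observed - created).total_seconds())  # 139.566655 ~= 139.566655848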
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.645027 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.645708 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.145694674 +0000 UTC m=+166.580364531 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.685613 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k7tn2" podStartSLOduration=140.685596787 podStartE2EDuration="2m20.685596787s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.684169538 +0000 UTC m=+166.118839395" watchObservedRunningTime="2025-12-04 15:22:38.685596787 +0000 UTC m=+166.120266644"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.734754 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-z4wmg" podStartSLOduration=139.734729893 podStartE2EDuration="2m19.734729893s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.729400637 +0000 UTC m=+166.164070494" watchObservedRunningTime="2025-12-04 15:22:38.734729893 +0000 UTC m=+166.169399750"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.747252 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.747971 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.247950135 +0000 UTC m=+166.682619992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.768610 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nvsfq" podStartSLOduration=139.768591901 podStartE2EDuration="2m19.768591901s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.765977169 +0000 UTC m=+166.200647026" watchObservedRunningTime="2025-12-04 15:22:38.768591901 +0000 UTC m=+166.203261758"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.802742 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" podStartSLOduration=140.802722316 podStartE2EDuration="2m20.802722316s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.800249658 +0000 UTC m=+166.234919515" watchObservedRunningTime="2025-12-04 15:22:38.802722316 +0000 UTC m=+166.237392173"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.833880 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 15:22:38 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld
Dec 04 15:22:38 crc kubenswrapper[4676]: [+]process-running ok
Dec 04 15:22:38 crc kubenswrapper[4676]: healthz check failed
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.833981 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.860494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.860806 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.360794927 +0000 UTC m=+166.795464784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.906712 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wk9bw" podStartSLOduration=11.906692824 podStartE2EDuration="11.906692824s" podCreationTimestamp="2025-12-04 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.859411869 +0000 UTC m=+166.294081736" watchObservedRunningTime="2025-12-04 15:22:38.906692824 +0000 UTC m=+166.341362681"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.944409 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" podStartSLOduration=139.944391157 podStartE2EDuration="2m19.944391157s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:38.941985091 +0000 UTC m=+166.376654968" watchObservedRunningTime="2025-12-04 15:22:38.944391157 +0000 UTC m=+166.379061014"
Dec 04 15:22:38 crc kubenswrapper[4676]: I1204 15:22:38.961496 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 04 15:22:38 crc kubenswrapper[4676]: E1204 15:22:38.961957 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:39.461930948 +0000 UTC m=+166.896600805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
[identical MountVolume.MountDevice / UnmountVolume.TearDown retry pairs for the same volume elided: 15:22:39.063082/.063533, .164610/.164861, .165049/.165571, .402234/.403137]
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.445310 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" event={"ID":"ae863415-6074-4ce2-9e25-8c0705ed1e80","Type":"ContainerStarted","Data":"5be09db0a110538d9849dd63ea4b30148dcba343b2d97322bd5f80cb67f59829"}
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.447406 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qbw9s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.447509 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qbw9s" podUID="1348ed48-644b-49f3-b674-92cd4e39d1ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.452537 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k"
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.452603 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4627g"
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.462267 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2rvct"
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.502679 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" podStartSLOduration=141.502625921 podStartE2EDuration="2m21.502625921s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:39.498028245 +0000 UTC m=+166.932698122" watchObservedRunningTime="2025-12-04 15:22:39.502625921 +0000 UTC m=+166.937295778"
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.504652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:22:39 crc kubenswrapper[4676]: E1204 15:22:39.505337 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:40.005321235 +0000 UTC m=+167.439991102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.575170 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-675c2"
[identical MountVolume.MountDevice / UnmountVolume.TearDown retry pairs for the same volume elided: 15:22:39.607532/.608725, .718715/.719060, .872817/.873360]
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.877835 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 04 15:22:39 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld
Dec 04 15:22:39 crc kubenswrapper[4676]: [+]process-running ok
Dec 04 15:22:39 crc kubenswrapper[4676]: healthz check failed
Dec 04 15:22:39 crc kubenswrapper[4676]: I1204 15:22:39.877889 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
[one further MountVolume.MountDevice retry pair elided: 15:22:39.975107/.975479]
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.019752 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.043206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.047408 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.073143 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.075518 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.081736 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.082171 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.090180 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:40.587400102 +0000 UTC m=+168.022069979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.115075 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.131946 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.173689 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.176612 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.179098 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184667 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184721 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbc79\" (UniqueName: \"kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184753 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184829 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184852 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7llgq\" (UniqueName: \"kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.184979 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.185275 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-04 15:22:40.685254343 +0000 UTC m=+168.119924280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.236103 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.236797 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.243357 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.244184 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.254472 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289479 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289664 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbml\" (UniqueName: \"kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" 
Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289792 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbc79\" (UniqueName: \"kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289817 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289885 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289925 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7llgq\" (UniqueName: \"kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.289954 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.290544 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.290628 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:40.7906127 +0000 UTC m=+168.225282547 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.291283 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.291503 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.293593 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.317004 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.326301 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.328300 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7llgq\" (UniqueName: \"kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq\") pod \"community-operators-ml7rm\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.330024 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbc79\" (UniqueName: \"kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79\") pod \"certified-operators-zd784\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.360320 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392677 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392726 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392757 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392822 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbml\" (UniqueName: \"kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392861 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.392941 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:40 crc 
kubenswrapper[4676]: E1204 15:22:40.393263 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:40.893245941 +0000 UTC m=+168.327915818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.393306 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.393541 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.393829 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.425460 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.455936 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbml\" (UniqueName: \"kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml\") pod \"certified-operators-srs6p\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.469588 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" event={"ID":"65156769-02c6-4cb1-a9ff-c51c8b458135","Type":"ContainerStarted","Data":"65b17b041bc1c1ae2d72528d385f55d25cdd7337e80287a08bbe4f2b388e6e02"} Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497467 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497728 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497759 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497792 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497859 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cw9\" (UniqueName: \"kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.497890 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.498068 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.498261 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:40.998232308 +0000 UTC m=+168.432902155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.512224 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.516776 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.579893 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.603250 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.603512 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4cw9\" (UniqueName: \"kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.603884 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.604105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.604699 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.615043 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.622476 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.122449951 +0000 UTC m=+168.557119808 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.706609 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.717990 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.217951688 +0000 UTC m=+168.652621545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.719736 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.724187 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.224168238 +0000 UTC m=+168.658838095 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.770869 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4cw9\" (UniqueName: \"kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9\") pod \"community-operators-dl6md\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.822630 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.823172 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.32315403 +0000 UTC m=+168.757823887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.829420 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:40 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:40 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:40 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.829529 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.926133 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:40 crc kubenswrapper[4676]: E1204 15:22:40.926819 4676 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.4268039 +0000 UTC m=+168.861473767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.951646 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.962230 4676 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 04 15:22:40 crc kubenswrapper[4676]: I1204 15:22:40.975702 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:22:40 crc kubenswrapper[4676]: W1204 15:22:40.986129 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod009171f0_c033_4ea6_b46d_0155fe9f3e71.slice/crio-bccdb6b590f45ff8071e3c1756f0a461afee34fa14da31ed2fe4179bfc338d08 WatchSource:0}: Error finding container bccdb6b590f45ff8071e3c1756f0a461afee34fa14da31ed2fe4179bfc338d08: Status 404 returned error can't find the container with id bccdb6b590f45ff8071e3c1756f0a461afee34fa14da31ed2fe4179bfc338d08 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.004333 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.026837 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.027010 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.526983864 +0000 UTC m=+168.961653721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.027120 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.027405 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.527393846 +0000 UTC m=+168.962063703 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.077682 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.078006 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.093014 4676 patch_prober.go:28] interesting pod/console-f9d7485db-mtj84 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.093087 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mtj84" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.121005 4676 patch_prober.go:28] interesting pod/downloads-7954f5f757-qbw9s container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.121096 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-qbw9s" podUID="1348ed48-644b-49f3-b674-92cd4e39d1ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.127892 4676 
patch_prober.go:28] interesting pod/downloads-7954f5f757-qbw9s container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.127962 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-qbw9s" podUID="1348ed48-644b-49f3-b674-92cd4e39d1ec" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.128077 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.129103 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.629087122 +0000 UTC m=+169.063756979 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.197568 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:22:41 crc kubenswrapper[4676]: W1204 15:22:41.208000 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda945f156_c10a_4132_8fb4_e43040790a01.slice/crio-c513391450352cf3180e27eedb6d41b560cdf631d4e0b4d0fc7395392a0de392 WatchSource:0}: Error finding container c513391450352cf3180e27eedb6d41b560cdf631d4e0b4d0fc7395392a0de392: Status 404 returned error can't find the container with id c513391450352cf3180e27eedb6d41b560cdf631d4e0b4d0fc7395392a0de392 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.229317 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.229687 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.729670987 +0000 UTC m=+169.164340844 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.236232 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 04 15:22:41 crc kubenswrapper[4676]: W1204 15:22:41.255888 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0adb4767_7a44_4873_93bf_d85c7cc5faa6.slice/crio-15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322 WatchSource:0}: Error finding container 15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322: Status 404 returned error can't find the container with id 15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.329861 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.330224 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.830159831 +0000 UTC m=+169.264829698 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.330743 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.331514 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.831500717 +0000 UTC m=+169.266170574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.351449 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.370256 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.371655 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.431209 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.431348 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.931330752 +0000 UTC m=+169.366000609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.431478 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.431784 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:41.931776125 +0000 UTC m=+169.366445982 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: W1204 15:22:41.432238 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c815dc_b379_4325_90a5_2a86fc80b7e5.slice/crio-e4a76872b988c326f0d130d5df1270df978173c188f6e7776ad2f91c62201328 WatchSource:0}: Error finding container e4a76872b988c326f0d130d5df1270df978173c188f6e7776ad2f91c62201328: Status 404 returned error can't find the container with id e4a76872b988c326f0d130d5df1270df978173c188f6e7776ad2f91c62201328 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.475030 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0adb4767-7a44-4873-93bf-d85c7cc5faa6","Type":"ContainerStarted","Data":"15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.477224 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerStarted","Data":"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.477353 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerStarted","Data":"c513391450352cf3180e27eedb6d41b560cdf631d4e0b4d0fc7395392a0de392"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.478462 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerStarted","Data":"e4a76872b988c326f0d130d5df1270df978173c188f6e7776ad2f91c62201328"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.482197 4676 generic.go:334] "Generic (PLEG): container finished" podID="daa64ebc-2612-4a0c-833e-be450fbbd5d0" containerID="68b7984aa978cfaa97649563029cb7f60581f4c8042338841f0cfde5163dad1a" exitCode=0 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.482346 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" event={"ID":"daa64ebc-2612-4a0c-833e-be450fbbd5d0","Type":"ContainerDied","Data":"68b7984aa978cfaa97649563029cb7f60581f4c8042338841f0cfde5163dad1a"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.491755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" event={"ID":"65156769-02c6-4cb1-a9ff-c51c8b458135","Type":"ContainerStarted","Data":"04f55545e848c802d18894c2438279780aa9847bcd0ba04bfb5c390092dc2f52"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.491809 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" event={"ID":"65156769-02c6-4cb1-a9ff-c51c8b458135","Type":"ContainerStarted","Data":"5016f6c0eb00350a867f634c06ece7839280a548879bbdf0d0165830f6658b0e"} Dec 04 
15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.498598 4676 generic.go:334] "Generic (PLEG): container finished" podID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerID="6de33229affe6b4b49e743d867488a601736f34be02f2a63e7419605b9e577c1" exitCode=0 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.498860 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerDied","Data":"6de33229affe6b4b49e743d867488a601736f34be02f2a63e7419605b9e577c1"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.499015 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerStarted","Data":"bccdb6b590f45ff8071e3c1756f0a461afee34fa14da31ed2fe4179bfc338d08"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.502067 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.504673 4676 generic.go:334] "Generic (PLEG): container finished" podID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerID="5e7e30e6a19597352591a28e6edb18f1a96027eddaaa321a7b0f4bed95f67503" exitCode=0 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.505413 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerDied","Data":"5e7e30e6a19597352591a28e6edb18f1a96027eddaaa321a7b0f4bed95f67503"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.505446 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerStarted","Data":"ad87eba11cf6c2683ed454a4d9ec5bd69265f69d77ee90e2bfd63af591a6c53b"} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.522697 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-4hnbc" podStartSLOduration=14.522674705 podStartE2EDuration="14.522674705s" podCreationTimestamp="2025-12-04 15:22:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:41.520126945 +0000 UTC m=+168.954796802" watchObservedRunningTime="2025-12-04 15:22:41.522674705 +0000 UTC m=+168.957344552" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.532466 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.532571 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-04 15:22:42.032548595 +0000 UTC m=+169.467218452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.533379 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: E1204 15:22:41.533750 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-04 15:22:42.033737978 +0000 UTC m=+169.468407835 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lfwj6" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.549033 4676 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-04T15:22:40.962277992Z","Handler":null,"Name":""} Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.553127 4676 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.553192 4676 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.611286 4676 patch_prober.go:28] interesting pod/apiserver-76f77b778f-x25bq container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]log ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]etcd ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/max-in-flight-filter ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 04 15:22:41 crc kubenswrapper[4676]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 04 15:22:41 crc kubenswrapper[4676]: 
[-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/project.openshift.io-projectcache ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-startinformers ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 04 15:22:41 crc kubenswrapper[4676]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 15:22:41 crc kubenswrapper[4676]: livez check failed Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.611342 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" podUID="ae863415-6074-4ce2-9e25-8c0705ed1e80" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.634278 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.644187 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.720152 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.721422 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.723055 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.732972 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.735204 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.735261 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bw5n\" (UniqueName: \"kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.735326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.735355 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.738092 4676 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
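The volume errors earlier in this window and the success immediately above trace a plugin-registration race: attacher.MountDevice and Unmounter.TearDownAt fail with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" only until the plugin's registration socket is picked up and validated (csi_plugin.go:100/113), with nestedpendingoperations.go holding each failed operation to a 500ms retry backoff in between. Once registered, the driver reports no STAGE_UNSTAGE_VOLUME capability, so the staging step is skipped and SetUp proceeds. Below is a minimal, self-contained sketch of that retry-then-check-capability pattern, assuming a hypothetical in-memory registry; none of these names (registry, register, lookup) are kubelet or CSI APIs.

```go
package main

import (
	"errors"
	"fmt"
	"sync"
	"time"
)

// registry is a hypothetical stand-in for the kubelet's table of registered
// CSI drivers; real registration happens over the plugin's -reg.sock socket.
type registry struct {
	mu      sync.Mutex
	drivers map[string]map[string]bool // driver name -> capability set
}

func (r *registry) register(name string, caps map[string]bool) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = caps
}

func (r *registry) lookup(name string) (map[string]bool, error) {
	r.mu.Lock()
	defer r.mu.Unlock()
	caps, ok := r.drivers[name]
	if !ok {
		return nil, errors.New("driver name " + name + " not found in the list of registered CSI drivers")
	}
	return caps, nil
}

func main() {
	reg := &registry{drivers: map[string]map[string]bool{}}
	const backoff = 500 * time.Millisecond // matches durationBeforeRetry in the log

	// Simulate the hostpath plugin registering shortly after the first attempt.
	go func() {
		time.Sleep(800 * time.Millisecond)
		reg.register("kubevirt.io.hostpath-provisioner",
			map[string]bool{"STAGE_UNSTAGE_VOLUME": false})
	}()

	for attempt := 1; ; attempt++ {
		caps, err := reg.lookup("kubevirt.io.hostpath-provisioner")
		if err != nil {
			fmt.Printf("attempt %d: MountDevice failed: %v; retrying in %s\n", attempt, err, backoff)
			time.Sleep(backoff)
			continue
		}
		if !caps["STAGE_UNSTAGE_VOLUME"] {
			// Mirrors "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..."
			fmt.Printf("attempt %d: capability not set, skipping MountDevice\n", attempt)
		}
		fmt.Printf("attempt %d: MountVolume.SetUp can proceed\n", attempt)
		return
	}
}
```

Run as-is, the sketch prints a couple of failed attempts and then the capability skip followed by the successful proceed, mirroring the short gap between the failed mount attempts and the post-registration MountDevice/SetUp success in the log above.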
Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.738131 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.774179 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lfwj6\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.815852 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.819335 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:41 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:41 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:41 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.819407 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.836765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bw5n\" (UniqueName: \"kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.836898 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.837031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.837485 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities\") pod \"redhat-marketplace-2brr7\" (UID: 
\"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.837592 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.914134 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:41 crc kubenswrapper[4676]: I1204 15:22:41.963576 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bw5n\" (UniqueName: \"kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n\") pod \"redhat-marketplace-2brr7\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.038781 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.120599 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.122103 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.154108 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz4wl\" (UniqueName: \"kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.154195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.154299 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.307723 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz4wl\" (UniqueName: \"kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.307771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities\") pod 
\"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.307935 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.309853 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.310182 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.310266 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.343173 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz4wl\" (UniqueName: \"kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl\") pod \"redhat-marketplace-7vsp8\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.346771 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"] Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.507823 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.513636 4676 generic.go:334] "Generic (PLEG): container finished" podID="0adb4767-7a44-4873-93bf-d85c7cc5faa6" containerID="61cb1aa5abd717cec0e97b6148211aab5906d96a7666168ba225fe4b1981e933" exitCode=0 Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.513723 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0adb4767-7a44-4873-93bf-d85c7cc5faa6","Type":"ContainerDied","Data":"61cb1aa5abd717cec0e97b6148211aab5906d96a7666168ba225fe4b1981e933"} Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.517696 4676 generic.go:334] "Generic (PLEG): container finished" podID="a945f156-c10a-4132-8fb4-e43040790a01" containerID="8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80" exitCode=0 Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.517798 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerDied","Data":"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80"} Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.520317 4676 generic.go:334] "Generic (PLEG): container finished" podID="69c815dc-b379-4325-90a5-2a86fc80b7e5" 
containerID="2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0" exitCode=0 Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.521093 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerDied","Data":"2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0"} Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.523579 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" event={"ID":"8742ff93-db20-4d4e-84fa-a9c4276643ea","Type":"ContainerStarted","Data":"967ffe1e29c10f2822cfc6aee466e5c675a067371a0ce40b0329affd7ed8bbca"} Dec 04 15:22:42 crc kubenswrapper[4676]: W1204 15:22:42.525252 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131c312c_f19d_4e87_8f86_8d38926b2d87.slice/crio-04afd6b7ce3f5fddf49f379e8307f30d8def9163c18cc9f7dad187121b53754e WatchSource:0}: Error finding container 04afd6b7ce3f5fddf49f379e8307f30d8def9163c18cc9f7dad187121b53754e: Status 404 returned error can't find the container with id 04afd6b7ce3f5fddf49f379e8307f30d8def9163c18cc9f7dad187121b53754e Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.616549 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.819388 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.821541 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:42 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:42 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:42 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.821593 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.920415 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:22:42 crc kubenswrapper[4676]: E1204 15:22:42.920849 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa64ebc-2612-4a0c-833e-be450fbbd5d0" containerName="collect-profiles" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.920938 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa64ebc-2612-4a0c-833e-be450fbbd5d0" containerName="collect-profiles" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.921115 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa64ebc-2612-4a0c-833e-be450fbbd5d0" containerName="collect-profiles" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.922082 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.926518 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 15:22:42 crc kubenswrapper[4676]: I1204 15:22:42.939888 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022311 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv6ll\" (UniqueName: \"kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll\") pod \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022579 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume\") pod \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume\") pod \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\" (UID: \"daa64ebc-2612-4a0c-833e-be450fbbd5d0\") " Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdst\" (UniqueName: \"kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.022875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.023603 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "daa64ebc-2612-4a0c-833e-be450fbbd5d0" (UID: "daa64ebc-2612-4a0c-833e-be450fbbd5d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.029166 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll" (OuterVolumeSpecName: "kube-api-access-mv6ll") pod "daa64ebc-2612-4a0c-833e-be450fbbd5d0" (UID: "daa64ebc-2612-4a0c-833e-be450fbbd5d0"). 
InnerVolumeSpecName "kube-api-access-mv6ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.030437 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "daa64ebc-2612-4a0c-833e-be450fbbd5d0" (UID: "daa64ebc-2612-4a0c-833e-be450fbbd5d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123697 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdst\" (UniqueName: \"kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123816 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123875 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123955 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/daa64ebc-2612-4a0c-833e-be450fbbd5d0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123971 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/daa64ebc-2612-4a0c-833e-be450fbbd5d0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.123980 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv6ll\" (UniqueName: \"kubernetes.io/projected/daa64ebc-2612-4a0c-833e-be450fbbd5d0-kube-api-access-mv6ll\") on node \"crc\" DevicePath \"\"" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.124589 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.124804 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.145834 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.146886 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wmdst\" (UniqueName: \"kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst\") pod \"redhat-operators-tx6hs\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: W1204 15:22:43.157554 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda656fb7b_4968_4459_a0fc_9fe6571ee582.slice/crio-f173ff21187ee57676f5fa36bac3b2721072cb773e02aaba937461a6e14f1b0e WatchSource:0}: Error finding container f173ff21187ee57676f5fa36bac3b2721072cb773e02aaba937461a6e14f1b0e: Status 404 returned error can't find the container with id f173ff21187ee57676f5fa36bac3b2721072cb773e02aaba937461a6e14f1b0e Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.296852 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.322588 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.323888 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.326279 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.326329 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424g5\" (UniqueName: \"kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.326379 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.347883 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.407428 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.428684 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.428729 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-424g5\" (UniqueName: \"kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.428775 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.429263 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.432366 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.481786 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424g5\" (UniqueName: \"kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5\") pod \"redhat-operators-jnsft\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.595563 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" event={"ID":"8742ff93-db20-4d4e-84fa-a9c4276643ea","Type":"ContainerStarted","Data":"11af11341549c246de2b757fb4020d95509901941c758904ef8024dbf003f204"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.595877 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.612121 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" event={"ID":"daa64ebc-2612-4a0c-833e-be450fbbd5d0","Type":"ContainerDied","Data":"0ddeb78d4851d219366fb49cdba1856aa5738f1f12cf4021ec533e98fb2cb108"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.612188 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.612193 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ddeb78d4851d219366fb49cdba1856aa5738f1f12cf4021ec533e98fb2cb108" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.616563 4676 generic.go:334] "Generic (PLEG): container finished" podID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerID="8f6b124a5f33a177645e479a8449c62b74d2b3379e936ddf189299bccc5dc731" exitCode=0 Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.616653 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerDied","Data":"8f6b124a5f33a177645e479a8449c62b74d2b3379e936ddf189299bccc5dc731"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.616690 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerStarted","Data":"f173ff21187ee57676f5fa36bac3b2721072cb773e02aaba937461a6e14f1b0e"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.627527 4676 generic.go:334] "Generic (PLEG): container finished" podID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerID="5ea13300ee3dc3a117c4795d066fdf9a06abed26bd2d8fe9e5eac05c915caabd" exitCode=0 Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.628392 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerDied","Data":"5ea13300ee3dc3a117c4795d066fdf9a06abed26bd2d8fe9e5eac05c915caabd"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.628419 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerStarted","Data":"04afd6b7ce3f5fddf49f379e8307f30d8def9163c18cc9f7dad187121b53754e"} Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.637355 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" podStartSLOduration=145.637335121 podStartE2EDuration="2m25.637335121s" podCreationTimestamp="2025-12-04 15:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:43.615700448 +0000 UTC m=+171.050370305" watchObservedRunningTime="2025-12-04 15:22:43.637335121 +0000 UTC m=+171.072004978" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.652441 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.665686 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:22:43 crc kubenswrapper[4676]: W1204 15:22:43.698783 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa95312_1f71_4167_9982_352d67b49f03.slice/crio-5725bf4099cdb8385d3f8d0d566605866978fb3279a281b1edd37781f03148dc WatchSource:0}: Error finding container 5725bf4099cdb8385d3f8d0d566605866978fb3279a281b1edd37781f03148dc: Status 404 returned error can't find the container with id 5725bf4099cdb8385d3f8d0d566605866978fb3279a281b1edd37781f03148dc Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.830272 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:43 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:43 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:43 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.830344 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:43 crc kubenswrapper[4676]: I1204 15:22:43.891682 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.038522 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir\") pod \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.038666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0adb4767-7a44-4873-93bf-d85c7cc5faa6" (UID: "0adb4767-7a44-4873-93bf-d85c7cc5faa6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.038953 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access\") pod \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\" (UID: \"0adb4767-7a44-4873-93bf-d85c7cc5faa6\") " Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.039404 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.053286 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0adb4767-7a44-4873-93bf-d85c7cc5faa6" (UID: "0adb4767-7a44-4873-93bf-d85c7cc5faa6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.058099 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:22:44 crc kubenswrapper[4676]: W1204 15:22:44.078711 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3eb3b7_9f03_46b3_890d_27429ead00a7.slice/crio-5df3781b0ba7c3234857d068fa90688c1f5b3a1099020ed33519bf9271c45002 WatchSource:0}: Error finding container 5df3781b0ba7c3234857d068fa90688c1f5b3a1099020ed33519bf9271c45002: Status 404 returned error can't find the container with id 5df3781b0ba7c3234857d068fa90688c1f5b3a1099020ed33519bf9271c45002 Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.141229 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adb4767-7a44-4873-93bf-d85c7cc5faa6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.446934 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.452923 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/711742b9-8c03-4234-ae1d-4d7d3baa4217-metrics-certs\") pod \"network-metrics-daemon-nsvkq\" (UID: \"711742b9-8c03-4234-ae1d-4d7d3baa4217\") " pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.614682 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nsvkq" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.635390 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerStarted","Data":"5df3781b0ba7c3234857d068fa90688c1f5b3a1099020ed33519bf9271c45002"} Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.638214 4676 generic.go:334] "Generic (PLEG): container finished" podID="1aa95312-1f71-4167-9982-352d67b49f03" containerID="3cb7423829b9d7255b8a038092e56073f245b18cde414f9b93a141f9f06e8eb9" exitCode=0 Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.638562 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerDied","Data":"3cb7423829b9d7255b8a038092e56073f245b18cde414f9b93a141f9f06e8eb9"} Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.638620 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerStarted","Data":"5725bf4099cdb8385d3f8d0d566605866978fb3279a281b1edd37781f03148dc"} Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.650750 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"0adb4767-7a44-4873-93bf-d85c7cc5faa6","Type":"ContainerDied","Data":"15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322"} Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.650798 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15ec2f16e9910ac9961208d4ed8399c0521d45ef0ad9dc803d9ccfdfdc9bc322" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.650797 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.820466 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:44 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:44 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:44 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:44 crc kubenswrapper[4676]: I1204 15:22:44.820569 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.137728 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nsvkq"] Dec 04 15:22:45 crc kubenswrapper[4676]: W1204 15:22:45.162149 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711742b9_8c03_4234_ae1d_4d7d3baa4217.slice/crio-330a1af0308dce791b1a35c81ba1630abd320adce2c4e1f7982a5dea9770a2bd WatchSource:0}: Error finding container 330a1af0308dce791b1a35c81ba1630abd320adce2c4e1f7982a5dea9770a2bd: Status 404 returned error can't find the container with id 330a1af0308dce791b1a35c81ba1630abd320adce2c4e1f7982a5dea9770a2bd Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.699821 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" event={"ID":"711742b9-8c03-4234-ae1d-4d7d3baa4217","Type":"ContainerStarted","Data":"330a1af0308dce791b1a35c81ba1630abd320adce2c4e1f7982a5dea9770a2bd"} Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.724299 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerID="4441381fae54264918b73263ba94e2cd1751b183fbbdd7d780f9a5ffe90325b4" exitCode=0 Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.724549 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerDied","Data":"4441381fae54264918b73263ba94e2cd1751b183fbbdd7d780f9a5ffe90325b4"} Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.821325 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:45 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:45 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:45 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:45 crc kubenswrapper[4676]: I1204 15:22:45.821408 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.026943 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.027021 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.186622 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 15:22:46 crc kubenswrapper[4676]: E1204 15:22:46.186975 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0adb4767-7a44-4873-93bf-d85c7cc5faa6" containerName="pruner" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.187009 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0adb4767-7a44-4873-93bf-d85c7cc5faa6" containerName="pruner" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.187171 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0adb4767-7a44-4873-93bf-d85c7cc5faa6" containerName="pruner" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.187745 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.190758 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.190979 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.204634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.286042 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wk9bw" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.312873 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.313058 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.377525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.384445 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-x25bq" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.417192 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.417261 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.418517 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.466706 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.518255 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.750522 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" event={"ID":"711742b9-8c03-4234-ae1d-4d7d3baa4217","Type":"ContainerStarted","Data":"1ed7909ededd72858ed5506c0007d6fe3b46582446143ffe8944352b8ba4d851"} Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.750864 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nsvkq" event={"ID":"711742b9-8c03-4234-ae1d-4d7d3baa4217","Type":"ContainerStarted","Data":"0444ebca1a010dfd0c12422c7a75249dd309aa917657adbcdcb4636a31911704"} Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.774245 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-nsvkq" podStartSLOduration=147.774205472 podStartE2EDuration="2m27.774205472s" podCreationTimestamp="2025-12-04 15:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:46.770780028 +0000 UTC m=+174.205449895" watchObservedRunningTime="2025-12-04 15:22:46.774205472 +0000 UTC m=+174.208875329" Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.824858 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:46 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:46 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:46 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:46 crc kubenswrapper[4676]: I1204 15:22:46.824946 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" 
podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:47 crc kubenswrapper[4676]: I1204 15:22:47.427442 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 04 15:22:47 crc kubenswrapper[4676]: W1204 15:22:47.519639 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5333def7_d665_44a4_881b_c86b0be58352.slice/crio-e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181 WatchSource:0}: Error finding container e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181: Status 404 returned error can't find the container with id e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181 Dec 04 15:22:47 crc kubenswrapper[4676]: I1204 15:22:47.761655 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5333def7-d665-44a4-881b-c86b0be58352","Type":"ContainerStarted","Data":"e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181"} Dec 04 15:22:47 crc kubenswrapper[4676]: I1204 15:22:47.818670 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:47 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:47 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:47 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:47 crc kubenswrapper[4676]: I1204 15:22:47.818781 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:48 crc kubenswrapper[4676]: I1204 15:22:48.819191 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:48 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:48 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:48 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:48 crc kubenswrapper[4676]: I1204 15:22:48.819529 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:49 crc kubenswrapper[4676]: I1204 15:22:49.819242 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:49 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:49 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:49 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:49 crc kubenswrapper[4676]: I1204 15:22:49.819382 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:50 crc kubenswrapper[4676]: I1204 15:22:50.819594 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:50 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:50 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:50 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:50 crc kubenswrapper[4676]: I1204 15:22:50.819839 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:51 crc kubenswrapper[4676]: I1204 15:22:51.078238 4676 patch_prober.go:28] interesting pod/console-f9d7485db-mtj84 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 04 15:22:51 crc kubenswrapper[4676]: I1204 15:22:51.078311 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mtj84" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 04 15:22:51 crc kubenswrapper[4676]: I1204 15:22:51.135525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-qbw9s" Dec 04 15:22:51 crc kubenswrapper[4676]: I1204 15:22:51.818619 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:51 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:51 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:51 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:51 crc kubenswrapper[4676]: I1204 15:22:51.818694 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:52 crc kubenswrapper[4676]: I1204 15:22:52.818527 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:52 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:52 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:52 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:52 crc kubenswrapper[4676]: I1204 15:22:52.818609 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:53 crc kubenswrapper[4676]: I1204 15:22:53.819150 4676 patch_prober.go:28] 
interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:53 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:53 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:53 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:53 crc kubenswrapper[4676]: I1204 15:22:53.819691 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:54 crc kubenswrapper[4676]: I1204 15:22:54.820028 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:54 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:54 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:54 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:54 crc kubenswrapper[4676]: I1204 15:22:54.820392 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:55 crc kubenswrapper[4676]: I1204 15:22:55.819384 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:55 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:55 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:55 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:55 crc kubenswrapper[4676]: I1204 15:22:55.819456 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:56 crc kubenswrapper[4676]: I1204 15:22:56.818436 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 15:22:56 crc kubenswrapper[4676]: [-]has-synced failed: reason withheld Dec 04 15:22:56 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:56 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:56 crc kubenswrapper[4676]: I1204 15:22:56.818536 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:57 crc kubenswrapper[4676]: I1204 15:22:57.825592 4676 patch_prober.go:28] interesting pod/router-default-5444994796-nrpqk container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 04 
15:22:57 crc kubenswrapper[4676]: [+]has-synced ok Dec 04 15:22:57 crc kubenswrapper[4676]: [+]process-running ok Dec 04 15:22:57 crc kubenswrapper[4676]: healthz check failed Dec 04 15:22:57 crc kubenswrapper[4676]: I1204 15:22:57.826003 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-nrpqk" podUID="6f91c5fa-e347-44f5-8229-cdaa1db9b7a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:22:58 crc kubenswrapper[4676]: I1204 15:22:58.819404 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:58 crc kubenswrapper[4676]: I1204 15:22:58.821783 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-nrpqk" Dec 04 15:22:58 crc kubenswrapper[4676]: I1204 15:22:58.849622 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5333def7-d665-44a4-881b-c86b0be58352","Type":"ContainerStarted","Data":"ede12dd96594d154566dfe8db008959cb67b674cb864f2b5852c5cefab976af5"} Dec 04 15:22:59 crc kubenswrapper[4676]: I1204 15:22:59.871251 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=13.871193005 podStartE2EDuration="13.871193005s" podCreationTimestamp="2025-12-04 15:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:22:59.868455369 +0000 UTC m=+187.303125216" watchObservedRunningTime="2025-12-04 15:22:59.871193005 +0000 UTC m=+187.305862862" Dec 04 15:23:00 crc kubenswrapper[4676]: I1204 15:23:00.860731 4676 generic.go:334] "Generic (PLEG): container finished" podID="5333def7-d665-44a4-881b-c86b0be58352" containerID="ede12dd96594d154566dfe8db008959cb67b674cb864f2b5852c5cefab976af5" exitCode=0 Dec 04 15:23:00 crc kubenswrapper[4676]: I1204 15:23:00.860785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5333def7-d665-44a4-881b-c86b0be58352","Type":"ContainerDied","Data":"ede12dd96594d154566dfe8db008959cb67b674cb864f2b5852c5cefab976af5"} Dec 04 15:23:01 crc kubenswrapper[4676]: I1204 15:23:01.078248 4676 patch_prober.go:28] interesting pod/console-f9d7485db-mtj84 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 04 15:23:01 crc kubenswrapper[4676]: I1204 15:23:01.078349 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-mtj84" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.021336 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.319900 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.461158 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access\") pod \"5333def7-d665-44a4-881b-c86b0be58352\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.461262 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir\") pod \"5333def7-d665-44a4-881b-c86b0be58352\" (UID: \"5333def7-d665-44a4-881b-c86b0be58352\") " Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.461459 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5333def7-d665-44a4-881b-c86b0be58352" (UID: "5333def7-d665-44a4-881b-c86b0be58352"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.462573 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5333def7-d665-44a4-881b-c86b0be58352-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.468192 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5333def7-d665-44a4-881b-c86b0be58352" (UID: "5333def7-d665-44a4-881b-c86b0be58352"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.564384 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5333def7-d665-44a4-881b-c86b0be58352-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.871868 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5333def7-d665-44a4-881b-c86b0be58352","Type":"ContainerDied","Data":"e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181"} Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.871942 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00cf9b4842d00fe7e3f57b43d0d5d05594a60da8756dc8b02ed0d27058fd181" Dec 04 15:23:02 crc kubenswrapper[4676]: I1204 15:23:02.871983 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 04 15:23:11 crc kubenswrapper[4676]: I1204 15:23:11.082054 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:23:11 crc kubenswrapper[4676]: I1204 15:23:11.087006 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:23:11 crc kubenswrapper[4676]: I1204 15:23:11.161004 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-g7j5k" Dec 04 15:23:16 crc kubenswrapper[4676]: I1204 15:23:16.026789 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:23:16 crc kubenswrapper[4676]: I1204 15:23:16.027139 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:23:17 crc kubenswrapper[4676]: E1204 15:23:17.482278 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 15:23:17 crc kubenswrapper[4676]: E1204 15:23:17.482706 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzbml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-srs6p_openshift-marketplace(8e48d278-595d-4cee-a3c7-ca1cf46a2184): ErrImagePull: rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:17 crc kubenswrapper[4676]: E1204 15:23:17.484010 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-srs6p" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.583917 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 15:23:19 crc kubenswrapper[4676]: E1204 15:23:19.584578 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5333def7-d665-44a4-881b-c86b0be58352" containerName="pruner" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.584611 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5333def7-d665-44a4-881b-c86b0be58352" containerName="pruner" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.584829 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5333def7-d665-44a4-881b-c86b0be58352" containerName="pruner" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.585457 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.587930 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.590759 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.599249 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.700810 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.700870 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.802782 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.802994 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 
04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.803007 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.823991 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:19 crc kubenswrapper[4676]: I1204 15:23:19.904265 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.186791 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.188168 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.192354 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.380175 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.380649 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.380776 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.481695 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.481777 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.481822 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" 
(UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.481937 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.481975 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.501365 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access\") pod \"installer-9-crc\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:25 crc kubenswrapper[4676]: I1204 15:23:25.516592 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.351069 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-srs6p" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.371320 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.371484 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-424g5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-jnsft_openshift-marketplace(2b3eb3b7-9f03-46b3-890d-27429ead00a7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.372643 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jnsft" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.482075 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.482296 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bw5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2brr7_openshift-marketplace(131c312c-f19d-4e87-8f86-8d38926b2d87): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:27 crc kubenswrapper[4676]: E1204 15:23:27.483506 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2brr7" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.336694 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jnsft" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.337329 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2brr7" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.811143 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.811743 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7llgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-ml7rm_openshift-marketplace(a945f156-c10a-4132-8fb4-e43040790a01): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.813028 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-ml7rm" podUID="a945f156-c10a-4132-8fb4-e43040790a01" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.878793 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.879126 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wbc79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-zd784_openshift-marketplace(009171f0-c033-4ea6-b46d-0155fe9f3e71): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:30 crc kubenswrapper[4676]: E1204 15:23:30.880330 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-zd784" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" Dec 04 15:23:31 crc kubenswrapper[4676]: I1204 15:23:31.009792 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerStarted","Data":"cff388542e0445c8cb02d77f4f3ebff3af17e3797020efa9358bae35a959e883"} Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.011582 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-zd784" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.011729 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-ml7rm" podUID="a945f156-c10a-4132-8fb4-e43040790a01" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.028005 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.028191 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s4cw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dl6md_openshift-marketplace(69c815dc-b379-4325-90a5-2a86fc80b7e5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.029423 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dl6md" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.138834 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.139202 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jz4wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7vsp8_openshift-marketplace(a656fb7b-4968-4459-a0fc-9fe6571ee582): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:23:31 crc kubenswrapper[4676]: E1204 15:23:31.140567 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7vsp8" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" Dec 04 15:23:31 crc kubenswrapper[4676]: I1204 15:23:31.151387 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 04 15:23:31 crc kubenswrapper[4676]: I1204 15:23:31.200798 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.016548 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89","Type":"ContainerStarted","Data":"9137264c6e0ec39aae04423932eddfaecb44a41fec269a67d5d9b9c5e2dff35a"} Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.020193 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89","Type":"ContainerStarted","Data":"fa5fc056d9f8ffd1f7706b146d639c82725c6700aceeb71824f943a9300937d1"} Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.020237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerDied","Data":"cff388542e0445c8cb02d77f4f3ebff3af17e3797020efa9358bae35a959e883"} Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.019569 4676 generic.go:334] "Generic (PLEG): container finished" podID="1aa95312-1f71-4167-9982-352d67b49f03" containerID="cff388542e0445c8cb02d77f4f3ebff3af17e3797020efa9358bae35a959e883" exitCode=0 Dec 04 15:23:32 crc 
kubenswrapper[4676]: I1204 15:23:32.021978 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b378ae9f-e6e9-4e71-8fb4-56d6239599eb","Type":"ContainerStarted","Data":"820f10ad00e9750187ce7ae5b48c49f6ca7f46efa860f86ac7ffdad0eb58cd97"} Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.022139 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b378ae9f-e6e9-4e71-8fb4-56d6239599eb","Type":"ContainerStarted","Data":"aaea9132cc8de043236e20f0d19f5f59cc9f16c17dde2ee8cfb1d4a657d240bc"} Dec 04 15:23:32 crc kubenswrapper[4676]: E1204 15:23:32.023417 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7vsp8" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" Dec 04 15:23:32 crc kubenswrapper[4676]: E1204 15:23:32.024083 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dl6md" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.050347 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.050263803 podStartE2EDuration="13.050263803s" podCreationTimestamp="2025-12-04 15:23:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:23:32.030543132 +0000 UTC m=+219.465212989" watchObservedRunningTime="2025-12-04 15:23:32.050263803 +0000 UTC m=+219.484933660" Dec 04 15:23:32 crc kubenswrapper[4676]: I1204 15:23:32.092145 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.092116743 podStartE2EDuration="7.092116743s" podCreationTimestamp="2025-12-04 15:23:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:23:32.092118693 +0000 UTC m=+219.526788560" watchObservedRunningTime="2025-12-04 15:23:32.092116743 +0000 UTC m=+219.526786600" Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.028678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerStarted","Data":"776decc324e4c9692ffb753724e2d065ae05bbf134d25d0e9d8887210b226df0"} Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.030555 4676 generic.go:334] "Generic (PLEG): container finished" podID="7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" containerID="9137264c6e0ec39aae04423932eddfaecb44a41fec269a67d5d9b9c5e2dff35a" exitCode=0 Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.030976 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89","Type":"ContainerDied","Data":"9137264c6e0ec39aae04423932eddfaecb44a41fec269a67d5d9b9c5e2dff35a"} Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.050538 4676 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tx6hs" podStartSLOduration=3.19581889 podStartE2EDuration="51.050518899s" podCreationTimestamp="2025-12-04 15:22:42 +0000 UTC" firstStartedPulling="2025-12-04 15:22:44.650057577 +0000 UTC m=+172.084727434" lastFinishedPulling="2025-12-04 15:23:32.504757586 +0000 UTC m=+219.939427443" observedRunningTime="2025-12-04 15:23:33.047603898 +0000 UTC m=+220.482273755" watchObservedRunningTime="2025-12-04 15:23:33.050518899 +0000 UTC m=+220.485188756" Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.298080 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:23:33 crc kubenswrapper[4676]: I1204 15:23:33.298133 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.283692 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.356114 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tx6hs" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="registry-server" probeResult="failure" output=< Dec 04 15:23:34 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 15:23:34 crc kubenswrapper[4676]: > Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.405502 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access\") pod \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.405696 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir\") pod \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\" (UID: \"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89\") " Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.405951 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" (UID: "7a849b6a-9443-4a40-8d5b-d31ad0b7dd89"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.411217 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" (UID: "7a849b6a-9443-4a40-8d5b-d31ad0b7dd89"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.507256 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:23:34 crc kubenswrapper[4676]: I1204 15:23:34.507306 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a849b6a-9443-4a40-8d5b-d31ad0b7dd89-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:23:35 crc kubenswrapper[4676]: I1204 15:23:35.047395 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 04 15:23:35 crc kubenswrapper[4676]: I1204 15:23:35.047360 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7a849b6a-9443-4a40-8d5b-d31ad0b7dd89","Type":"ContainerDied","Data":"fa5fc056d9f8ffd1f7706b146d639c82725c6700aceeb71824f943a9300937d1"} Dec 04 15:23:35 crc kubenswrapper[4676]: I1204 15:23:35.047740 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5fc056d9f8ffd1f7706b146d639c82725c6700aceeb71824f943a9300937d1" Dec 04 15:23:38 crc kubenswrapper[4676]: I1204 15:23:38.400755 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:23:40 crc kubenswrapper[4676]: I1204 15:23:40.080891 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerStarted","Data":"46fe923414546ca46d8283430514c444869e466a75895d212388b7487b14fa04"} Dec 04 15:23:41 crc kubenswrapper[4676]: I1204 15:23:41.089395 4676 generic.go:334] "Generic (PLEG): container finished" podID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerID="46fe923414546ca46d8283430514c444869e466a75895d212388b7487b14fa04" exitCode=0 Dec 04 15:23:41 crc kubenswrapper[4676]: I1204 15:23:41.089457 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerDied","Data":"46fe923414546ca46d8283430514c444869e466a75895d212388b7487b14fa04"} Dec 04 15:23:42 crc kubenswrapper[4676]: I1204 15:23:42.098254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerStarted","Data":"0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e"} Dec 04 15:23:43 crc kubenswrapper[4676]: I1204 15:23:43.132831 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-srs6p" podStartSLOduration=3.070140544 podStartE2EDuration="1m3.132774466s" podCreationTimestamp="2025-12-04 15:22:40 +0000 UTC" firstStartedPulling="2025-12-04 15:22:41.506693227 +0000 UTC m=+168.941363084" lastFinishedPulling="2025-12-04 15:23:41.569327149 +0000 UTC m=+229.003997006" observedRunningTime="2025-12-04 15:23:43.12181966 +0000 UTC m=+230.556489517" watchObservedRunningTime="2025-12-04 15:23:43.132774466 +0000 UTC m=+230.567444323" Dec 04 15:23:43 crc kubenswrapper[4676]: I1204 15:23:43.657605 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 
04 15:23:43 crc kubenswrapper[4676]: I1204 15:23:43.704793 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:23:45 crc kubenswrapper[4676]: I1204 15:23:45.117251 4676 generic.go:334] "Generic (PLEG): container finished" podID="a945f156-c10a-4132-8fb4-e43040790a01" containerID="cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1" exitCode=0 Dec 04 15:23:45 crc kubenswrapper[4676]: I1204 15:23:45.117339 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerDied","Data":"cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1"} Dec 04 15:23:45 crc kubenswrapper[4676]: I1204 15:23:45.120661 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerStarted","Data":"9e351689f2d75d62635efced3236a21ddc32897aa607954647fdad7d26cb2408"} Dec 04 15:23:45 crc kubenswrapper[4676]: I1204 15:23:45.123371 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerStarted","Data":"8ac348fda79b237d6b5edb0c52832a1da7ee9b5db48d3403a44ed0e0f9653ab5"} Dec 04 15:23:45 crc kubenswrapper[4676]: I1204 15:23:45.126216 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerStarted","Data":"0f71e7f525a1d95d7771fa23d45c7e8a9fe8f14c4c87139acca0f2e2ea7bf73c"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.027206 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.027850 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.027978 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.028612 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.028801 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8" gracePeriod=600 Dec 04 15:23:46 crc kubenswrapper[4676]: 
I1204 15:23:46.132671 4676 generic.go:334] "Generic (PLEG): container finished" podID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerID="0f71e7f525a1d95d7771fa23d45c7e8a9fe8f14c4c87139acca0f2e2ea7bf73c" exitCode=0 Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.132719 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerDied","Data":"0f71e7f525a1d95d7771fa23d45c7e8a9fe8f14c4c87139acca0f2e2ea7bf73c"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.135394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerStarted","Data":"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.138347 4676 generic.go:334] "Generic (PLEG): container finished" podID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerID="9e351689f2d75d62635efced3236a21ddc32897aa607954647fdad7d26cb2408" exitCode=0 Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.138403 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerDied","Data":"9e351689f2d75d62635efced3236a21ddc32897aa607954647fdad7d26cb2408"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.140744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerStarted","Data":"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.142601 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerID="8ac348fda79b237d6b5edb0c52832a1da7ee9b5db48d3403a44ed0e0f9653ab5" exitCode=0 Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.142628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerDied","Data":"8ac348fda79b237d6b5edb0c52832a1da7ee9b5db48d3403a44ed0e0f9653ab5"} Dec 04 15:23:46 crc kubenswrapper[4676]: I1204 15:23:46.222261 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ml7rm" podStartSLOduration=4.115382118 podStartE2EDuration="1m7.222243133s" podCreationTimestamp="2025-12-04 15:22:39 +0000 UTC" firstStartedPulling="2025-12-04 15:22:42.522256101 +0000 UTC m=+169.956925958" lastFinishedPulling="2025-12-04 15:23:45.629117116 +0000 UTC m=+233.063786973" observedRunningTime="2025-12-04 15:23:46.218976792 +0000 UTC m=+233.653646659" watchObservedRunningTime="2025-12-04 15:23:46.222243133 +0000 UTC m=+233.656912980" Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.151996 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerStarted","Data":"88443358f044707098a6957bb486390af8265efb28696463f962f7bd7cffa00b"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.155849 4676 generic.go:334] "Generic (PLEG): container finished" podID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerID="46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb" exitCode=0 Dec 04 
15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.155956 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerDied","Data":"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.158568 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8" exitCode=0 Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.158635 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.158679 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.162076 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerStarted","Data":"3c57c346d41caa9987d913b1c31d1f0a16fd9e1d3614d8698f47553da0d772cc"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.165057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerStarted","Data":"6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.167030 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerStarted","Data":"ab4fd5783279cead129241dceb4ace52911cf8e650645bb02fe3dff611031b37"} Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.176527 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2brr7" podStartSLOduration=3.244215207 podStartE2EDuration="1m6.176498144s" podCreationTimestamp="2025-12-04 15:22:41 +0000 UTC" firstStartedPulling="2025-12-04 15:22:43.629342912 +0000 UTC m=+171.064012769" lastFinishedPulling="2025-12-04 15:23:46.561625849 +0000 UTC m=+233.996295706" observedRunningTime="2025-12-04 15:23:47.174059086 +0000 UTC m=+234.608728963" watchObservedRunningTime="2025-12-04 15:23:47.176498144 +0000 UTC m=+234.611168001" Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.256942 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7vsp8" podStartSLOduration=2.242725811 podStartE2EDuration="1m5.256897411s" podCreationTimestamp="2025-12-04 15:22:42 +0000 UTC" firstStartedPulling="2025-12-04 15:22:43.61978737 +0000 UTC m=+171.054457227" lastFinishedPulling="2025-12-04 15:23:46.63395896 +0000 UTC m=+234.068628827" observedRunningTime="2025-12-04 15:23:47.255462611 +0000 UTC m=+234.690132468" watchObservedRunningTime="2025-12-04 15:23:47.256897411 +0000 UTC m=+234.691567268" Dec 04 15:23:47 crc kubenswrapper[4676]: I1204 15:23:47.278200 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jnsft" podStartSLOduration=3.488961389 podStartE2EDuration="1m4.278174086s" podCreationTimestamp="2025-12-04 15:22:43 +0000 UTC" firstStartedPulling="2025-12-04 15:22:45.726769556 +0000 UTC m=+173.161439413" lastFinishedPulling="2025-12-04 15:23:46.515982253 +0000 UTC m=+233.950652110" observedRunningTime="2025-12-04 15:23:47.276651633 +0000 UTC m=+234.711321520" watchObservedRunningTime="2025-12-04 15:23:47.278174086 +0000 UTC m=+234.712843943" Dec 04 15:23:48 crc kubenswrapper[4676]: I1204 15:23:48.175514 4676 generic.go:334] "Generic (PLEG): container finished" podID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerID="ab4fd5783279cead129241dceb4ace52911cf8e650645bb02fe3dff611031b37" exitCode=0 Dec 04 15:23:48 crc kubenswrapper[4676]: I1204 15:23:48.175878 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerDied","Data":"ab4fd5783279cead129241dceb4ace52911cf8e650645bb02fe3dff611031b37"} Dec 04 15:23:48 crc kubenswrapper[4676]: I1204 15:23:48.179498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerStarted","Data":"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46"} Dec 04 15:23:49 crc kubenswrapper[4676]: I1204 15:23:49.188781 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerStarted","Data":"f6f46434e5c90eea329a9cbe61386e51ae33c5cd53ef7a27a95fa3359b018b16"} Dec 04 15:23:49 crc kubenswrapper[4676]: I1204 15:23:49.209137 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zd784" podStartSLOduration=2.833774482 podStartE2EDuration="1m10.209114893s" podCreationTimestamp="2025-12-04 15:22:39 +0000 UTC" firstStartedPulling="2025-12-04 15:22:41.501397892 +0000 UTC m=+168.936067749" lastFinishedPulling="2025-12-04 15:23:48.876738303 +0000 UTC m=+236.311408160" observedRunningTime="2025-12-04 15:23:49.207167368 +0000 UTC m=+236.641837245" watchObservedRunningTime="2025-12-04 15:23:49.209114893 +0000 UTC m=+236.643784750" Dec 04 15:23:49 crc kubenswrapper[4676]: I1204 15:23:49.211718 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dl6md" podStartSLOduration=4.19633944 podStartE2EDuration="1m9.211709705s" podCreationTimestamp="2025-12-04 15:22:40 +0000 UTC" firstStartedPulling="2025-12-04 15:22:42.522319643 +0000 UTC m=+169.956989500" lastFinishedPulling="2025-12-04 15:23:47.537689908 +0000 UTC m=+234.972359765" observedRunningTime="2025-12-04 15:23:48.229248906 +0000 UTC m=+235.663918783" watchObservedRunningTime="2025-12-04 15:23:49.211709705 +0000 UTC m=+236.646379562" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.395036 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.395278 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.427062 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.427114 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.462435 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.513090 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.513489 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.549232 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.880633 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.976490 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:23:50 crc kubenswrapper[4676]: I1204 15:23:50.976586 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:23:51 crc kubenswrapper[4676]: I1204 15:23:51.023023 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:23:51 crc kubenswrapper[4676]: I1204 15:23:51.242445 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:23:51 crc kubenswrapper[4676]: I1204 15:23:51.247524 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.040338 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.040679 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.083881 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.238301 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.617554 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.617942 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:23:52 crc kubenswrapper[4676]: I1204 15:23:52.656574 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:23:53 crc kubenswrapper[4676]: I1204 15:23:53.248238 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:23:53 crc kubenswrapper[4676]: I1204 15:23:53.652706 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:23:53 crc kubenswrapper[4676]: I1204 15:23:53.652765 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:23:53 crc kubenswrapper[4676]: I1204 15:23:53.689534 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:23:54 crc kubenswrapper[4676]: I1204 15:23:54.265442 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:23:55 crc kubenswrapper[4676]: I1204 15:23:55.118596 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:23:55 crc kubenswrapper[4676]: I1204 15:23:55.119147 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-srs6p" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="registry-server" containerID="cri-o://0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" gracePeriod=2 Dec 04 15:23:55 crc kubenswrapper[4676]: I1204 15:23:55.720935 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:23:56 crc kubenswrapper[4676]: I1204 15:23:56.223837 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7vsp8" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="registry-server" containerID="cri-o://6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" gracePeriod=2 Dec 04 15:23:57 crc kubenswrapper[4676]: I1204 15:23:57.521735 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:23:57 crc kubenswrapper[4676]: I1204 15:23:57.522243 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jnsft" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="registry-server" containerID="cri-o://3c57c346d41caa9987d913b1c31d1f0a16fd9e1d3614d8698f47553da0d772cc" gracePeriod=2 Dec 04 15:23:58 crc kubenswrapper[4676]: I1204 15:23:58.243530 4676 generic.go:334] "Generic (PLEG): container finished" podID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerID="0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" exitCode=0 Dec 04 15:23:58 crc kubenswrapper[4676]: I1204 15:23:58.243586 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerDied","Data":"0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e"} Dec 04 15:24:00 crc kubenswrapper[4676]: I1204 15:24:00.260747 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerDied","Data":"3c57c346d41caa9987d913b1c31d1f0a16fd9e1d3614d8698f47553da0d772cc"} Dec 04 15:24:00 crc kubenswrapper[4676]: I1204 15:24:00.260711 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerID="3c57c346d41caa9987d913b1c31d1f0a16fd9e1d3614d8698f47553da0d772cc" exitCode=0 Dec 04 15:24:00 crc kubenswrapper[4676]: I1204 15:24:00.433953 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:24:00 crc kubenswrapper[4676]: E1204 15:24:00.514147 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e is running failed: container process not found" containerID="0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:00 crc kubenswrapper[4676]: E1204 15:24:00.514731 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e is running failed: container process not found" containerID="0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:00 crc kubenswrapper[4676]: E1204 15:24:00.515252 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e is running failed: container process not found" containerID="0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:00 crc kubenswrapper[4676]: E1204 15:24:00.515332 4676 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-srs6p" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="registry-server" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.018347 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.269190 4676 generic.go:334] "Generic (PLEG): container finished" podID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerID="6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" exitCode=0 Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.269253 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerDied","Data":"6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45"} Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.613310 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.735359 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbml\" (UniqueName: \"kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml\") pod \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.735507 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities\") pod \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.735602 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content\") pod \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\" (UID: \"8e48d278-595d-4cee-a3c7-ca1cf46a2184\") " Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.736476 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities" (OuterVolumeSpecName: "utilities") pod "8e48d278-595d-4cee-a3c7-ca1cf46a2184" (UID: "8e48d278-595d-4cee-a3c7-ca1cf46a2184"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.742260 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml" (OuterVolumeSpecName: "kube-api-access-fzbml") pod "8e48d278-595d-4cee-a3c7-ca1cf46a2184" (UID: "8e48d278-595d-4cee-a3c7-ca1cf46a2184"). InnerVolumeSpecName "kube-api-access-fzbml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.782977 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e48d278-595d-4cee-a3c7-ca1cf46a2184" (UID: "8e48d278-595d-4cee-a3c7-ca1cf46a2184"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.837437 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.837486 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbml\" (UniqueName: \"kubernetes.io/projected/8e48d278-595d-4cee-a3c7-ca1cf46a2184-kube-api-access-fzbml\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:01 crc kubenswrapper[4676]: I1204 15:24:01.837500 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e48d278-595d-4cee-a3c7-ca1cf46a2184-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.276493 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-srs6p" event={"ID":"8e48d278-595d-4cee-a3c7-ca1cf46a2184","Type":"ContainerDied","Data":"ad87eba11cf6c2683ed454a4d9ec5bd69265f69d77ee90e2bfd63af591a6c53b"} Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.276569 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-srs6p" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.276585 4676 scope.go:117] "RemoveContainer" containerID="0b64778d9d1e8a9b5fbd9fe833e08f873aae0e0253787b6951deea8443ea215e" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.292589 4676 scope.go:117] "RemoveContainer" containerID="46fe923414546ca46d8283430514c444869e466a75895d212388b7487b14fa04" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.308097 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.312284 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-srs6p"] Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.330288 4676 scope.go:117] "RemoveContainer" containerID="5e7e30e6a19597352591a28e6edb18f1a96027eddaaa321a7b0f4bed95f67503" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.531696 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.537991 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dl6md" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="registry-server" containerID="cri-o://fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46" gracePeriod=2 Dec 04 15:24:02 crc kubenswrapper[4676]: E1204 15:24:02.617887 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45 is running failed: container process not found" containerID="6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:02 crc kubenswrapper[4676]: E1204 15:24:02.618802 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45 is 
running failed: container process not found" containerID="6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:02 crc kubenswrapper[4676]: E1204 15:24:02.619495 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45 is running failed: container process not found" containerID="6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" cmd=["grpc_health_probe","-addr=:50051"] Dec 04 15:24:02 crc kubenswrapper[4676]: E1204 15:24:02.619546 4676 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-7vsp8" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="registry-server" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.653517 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.674438 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-424g5\" (UniqueName: \"kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5\") pod \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749737 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content\") pod \"a656fb7b-4968-4459-a0fc-9fe6571ee582\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749797 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz4wl\" (UniqueName: \"kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl\") pod \"a656fb7b-4968-4459-a0fc-9fe6571ee582\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749826 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities\") pod \"a656fb7b-4968-4459-a0fc-9fe6571ee582\" (UID: \"a656fb7b-4968-4459-a0fc-9fe6571ee582\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749921 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content\") pod \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\" (UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.749974 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities\") pod \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\" 
(UID: \"2b3eb3b7-9f03-46b3-890d-27429ead00a7\") " Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.750834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities" (OuterVolumeSpecName: "utilities") pod "a656fb7b-4968-4459-a0fc-9fe6571ee582" (UID: "a656fb7b-4968-4459-a0fc-9fe6571ee582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.750892 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities" (OuterVolumeSpecName: "utilities") pod "2b3eb3b7-9f03-46b3-890d-27429ead00a7" (UID: "2b3eb3b7-9f03-46b3-890d-27429ead00a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.755834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl" (OuterVolumeSpecName: "kube-api-access-jz4wl") pod "a656fb7b-4968-4459-a0fc-9fe6571ee582" (UID: "a656fb7b-4968-4459-a0fc-9fe6571ee582"). InnerVolumeSpecName "kube-api-access-jz4wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.756083 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5" (OuterVolumeSpecName: "kube-api-access-424g5") pod "2b3eb3b7-9f03-46b3-890d-27429ead00a7" (UID: "2b3eb3b7-9f03-46b3-890d-27429ead00a7"). InnerVolumeSpecName "kube-api-access-424g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.771772 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a656fb7b-4968-4459-a0fc-9fe6571ee582" (UID: "a656fb7b-4968-4459-a0fc-9fe6571ee582"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.851963 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.852022 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-424g5\" (UniqueName: \"kubernetes.io/projected/2b3eb3b7-9f03-46b3-890d-27429ead00a7-kube-api-access-424g5\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.852086 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.852102 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz4wl\" (UniqueName: \"kubernetes.io/projected/a656fb7b-4968-4459-a0fc-9fe6571ee582-kube-api-access-jz4wl\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:02 crc kubenswrapper[4676]: I1204 15:24:02.852114 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a656fb7b-4968-4459-a0fc-9fe6571ee582-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.286956 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnsft" event={"ID":"2b3eb3b7-9f03-46b3-890d-27429ead00a7","Type":"ContainerDied","Data":"5df3781b0ba7c3234857d068fa90688c1f5b3a1099020ed33519bf9271c45002"} Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.288038 4676 scope.go:117] "RemoveContainer" containerID="3c57c346d41caa9987d913b1c31d1f0a16fd9e1d3614d8698f47553da0d772cc" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.288582 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnsft" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.289146 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7vsp8" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.289145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7vsp8" event={"ID":"a656fb7b-4968-4459-a0fc-9fe6571ee582","Type":"ContainerDied","Data":"f173ff21187ee57676f5fa36bac3b2721072cb773e02aaba937461a6e14f1b0e"} Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.303533 4676 scope.go:117] "RemoveContainer" containerID="8ac348fda79b237d6b5edb0c52832a1da7ee9b5db48d3403a44ed0e0f9653ab5" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.321682 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.324053 4676 scope.go:117] "RemoveContainer" containerID="4441381fae54264918b73263ba94e2cd1751b183fbbdd7d780f9a5ffe90325b4" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.331868 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7vsp8"] Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.339190 4676 scope.go:117] "RemoveContainer" containerID="6188d1325ad8aa558fbf5dcf433a31cfc388e324c4392360dc37d4da1f623b45" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.393888 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" path="/var/lib/kubelet/pods/8e48d278-595d-4cee-a3c7-ca1cf46a2184/volumes" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.395188 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" path="/var/lib/kubelet/pods/a656fb7b-4968-4459-a0fc-9fe6571ee582/volumes" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.438798 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerName="oauth-openshift" containerID="cri-o://b5c62d7e3b199afb0b2bcb3eccdd6ff6cdf5e89ca004876db6b9ed13fc69a4d0" gracePeriod=15 Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.689989 4676 scope.go:117] "RemoveContainer" containerID="0f71e7f525a1d95d7771fa23d45c7e8a9fe8f14c4c87139acca0f2e2ea7bf73c" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.694636 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b3eb3b7-9f03-46b3-890d-27429ead00a7" (UID: "2b3eb3b7-9f03-46b3-890d-27429ead00a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.719945 4676 scope.go:117] "RemoveContainer" containerID="8f6b124a5f33a177645e479a8449c62b74d2b3379e936ddf189299bccc5dc731" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.768706 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b3eb3b7-9f03-46b3-890d-27429ead00a7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.914753 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.920464 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jnsft"] Dec 04 15:24:03 crc kubenswrapper[4676]: I1204 15:24:03.978596 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.072504 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities\") pod \"69c815dc-b379-4325-90a5-2a86fc80b7e5\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.072873 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4cw9\" (UniqueName: \"kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9\") pod \"69c815dc-b379-4325-90a5-2a86fc80b7e5\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.072987 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content\") pod \"69c815dc-b379-4325-90a5-2a86fc80b7e5\" (UID: \"69c815dc-b379-4325-90a5-2a86fc80b7e5\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.074515 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities" (OuterVolumeSpecName: "utilities") pod "69c815dc-b379-4325-90a5-2a86fc80b7e5" (UID: "69c815dc-b379-4325-90a5-2a86fc80b7e5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.076564 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9" (OuterVolumeSpecName: "kube-api-access-s4cw9") pod "69c815dc-b379-4325-90a5-2a86fc80b7e5" (UID: "69c815dc-b379-4325-90a5-2a86fc80b7e5"). InnerVolumeSpecName "kube-api-access-s4cw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.120388 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69c815dc-b379-4325-90a5-2a86fc80b7e5" (UID: "69c815dc-b379-4325-90a5-2a86fc80b7e5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.174604 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.174653 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4cw9\" (UniqueName: \"kubernetes.io/projected/69c815dc-b379-4325-90a5-2a86fc80b7e5-kube-api-access-s4cw9\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.174673 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69c815dc-b379-4325-90a5-2a86fc80b7e5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.301679 4676 generic.go:334] "Generic (PLEG): container finished" podID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerID="b5c62d7e3b199afb0b2bcb3eccdd6ff6cdf5e89ca004876db6b9ed13fc69a4d0" exitCode=0 Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.301768 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" event={"ID":"d35d3a3f-f614-45fa-a59a-e5cefa471321","Type":"ContainerDied","Data":"b5c62d7e3b199afb0b2bcb3eccdd6ff6cdf5e89ca004876db6b9ed13fc69a4d0"} Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.304513 4676 generic.go:334] "Generic (PLEG): container finished" podID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerID="fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46" exitCode=0 Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.304570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerDied","Data":"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46"} Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.304624 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dl6md" event={"ID":"69c815dc-b379-4325-90a5-2a86fc80b7e5","Type":"ContainerDied","Data":"e4a76872b988c326f0d130d5df1270df978173c188f6e7776ad2f91c62201328"} Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.304654 4676 scope.go:117] "RemoveContainer" containerID="fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.304642 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dl6md" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.327039 4676 scope.go:117] "RemoveContainer" containerID="46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.343296 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.345575 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dl6md"] Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.356280 4676 scope.go:117] "RemoveContainer" containerID="2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.376062 4676 scope.go:117] "RemoveContainer" containerID="fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.376701 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46\": container with ID starting with fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46 not found: ID does not exist" containerID="fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.376797 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46"} err="failed to get container status \"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46\": rpc error: code = NotFound desc = could not find container \"fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46\": container with ID starting with fe8f879e90fdf31a75aeb3f0bf268c52a55e2d643cc9ffc6add078b141f98b46 not found: ID does not exist" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.376832 4676 scope.go:117] "RemoveContainer" containerID="46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.377379 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb\": container with ID starting with 46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb not found: ID does not exist" containerID="46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.377433 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb"} err="failed to get container status \"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb\": rpc error: code = NotFound desc = could not find container \"46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb\": container with ID starting with 46864debd7728cfe41bbe8634ce560accd19f9155e3a6da6e736b276f6ef69cb not found: ID does not exist" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.377488 4676 scope.go:117] "RemoveContainer" containerID="2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.378140 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0\": container with ID starting with 2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0 not found: ID does not exist" containerID="2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.378220 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0"} err="failed to get container status \"2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0\": rpc error: code = NotFound desc = could not find container \"2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0\": container with ID starting with 2b4e02a5a0304a6389df148f6d102c4f23246923d52acd571d254321887807b0 not found: ID does not exist" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.428828 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.478154 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.478209 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.478252 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf598\" (UniqueName: \"kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.478272 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480114 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480194 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480249 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480281 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480306 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480348 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480389 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480414 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480432 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480455 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 
15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480472 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session\") pod \"d35d3a3f-f614-45fa-a59a-e5cefa471321\" (UID: \"d35d3a3f-f614-45fa-a59a-e5cefa471321\") " Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.480703 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.481032 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.481054 4676 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.481656 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.482236 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.483964 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.484381 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.484414 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.484489 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598" (OuterVolumeSpecName: "kube-api-access-mf598") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "kube-api-access-mf598". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.485085 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.486939 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.487206 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.488167 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.488843 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.489984 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d35d3a3f-f614-45fa-a59a-e5cefa471321" (UID: "d35d3a3f-f614-45fa-a59a-e5cefa471321"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582422 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf598\" (UniqueName: \"kubernetes.io/projected/d35d3a3f-f614-45fa-a59a-e5cefa471321-kube-api-access-mf598\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582459 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582470 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582479 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582494 4676 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582505 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582514 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582528 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582541 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582550 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc 
kubenswrapper[4676]: I1204 15:24:04.582558 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.582597 4676 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d35d3a3f-f614-45fa-a59a-e5cefa471321-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.634705 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-2wvqm"] Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635115 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635154 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635174 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635183 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635194 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635202 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635216 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635225 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635245 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635253 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635267 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635274 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635284 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635292 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="extract-utilities" Dec 04 15:24:04 crc 
kubenswrapper[4676]: E1204 15:24:04.635302 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635310 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635322 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635330 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="extract-content" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635339 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerName="oauth-openshift" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635346 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerName="oauth-openshift" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635353 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635360 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635371 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635378 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="extract-utilities" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635386 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" containerName="pruner" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635394 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" containerName="pruner" Dec 04 15:24:04 crc kubenswrapper[4676]: E1204 15:24:04.635403 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635409 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635543 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635563 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635573 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a656fb7b-4968-4459-a0fc-9fe6571ee582" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635581 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a849b6a-9443-4a40-8d5b-d31ad0b7dd89" containerName="pruner" 
Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635587 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e48d278-595d-4cee-a3c7-ca1cf46a2184" containerName="registry-server" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.635595 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" containerName="oauth-openshift" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.636207 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.645163 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-2wvqm"] Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684366 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684463 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684487 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-policies\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684651 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27z4q\" (UniqueName: \"kubernetes.io/projected/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-kube-api-access-27z4q\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 
15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684773 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.684827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.685007 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-dir\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.685028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.685048 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.685125 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.685287 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: 
\"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786396 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-dir\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786461 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786551 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786582 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786622 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786665 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " 
pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786685 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-policies\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786705 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27z4q\" (UniqueName: \"kubernetes.io/projected/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-kube-api-access-27z4q\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786725 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.786783 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.787135 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-dir\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.788006 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-service-ca\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc 
kubenswrapper[4676]: I1204 15:24:04.788334 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-audit-policies\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.788558 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.788935 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-cliconfig\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.790377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-router-certs\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.790770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.790844 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-serving-cert\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.791474 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-login\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.792202 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-error\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.792552 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.792826 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-session\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.793286 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.804539 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27z4q\" (UniqueName: \"kubernetes.io/projected/20f45de0-b2ef-4fd6-9cf8-b79a0c831563-kube-api-access-27z4q\") pod \"oauth-openshift-f58fb8db6-2wvqm\" (UID: \"20f45de0-b2ef-4fd6-9cf8-b79a0c831563\") " pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:04 crc kubenswrapper[4676]: I1204 15:24:04.955774 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.312767 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.312953 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-675c2" event={"ID":"d35d3a3f-f614-45fa-a59a-e5cefa471321","Type":"ContainerDied","Data":"ef5dfe9325db7a54c3641d571301202da5c6ddcc88855e02a3e6042b0e4ae03e"} Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.313010 4676 scope.go:117] "RemoveContainer" containerID="b5c62d7e3b199afb0b2bcb3eccdd6ff6cdf5e89ca004876db6b9ed13fc69a4d0" Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.354626 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.360989 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-675c2"] Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.363495 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-f58fb8db6-2wvqm"] Dec 04 15:24:05 crc kubenswrapper[4676]: W1204 15:24:05.367137 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f45de0_b2ef_4fd6_9cf8_b79a0c831563.slice/crio-cf3716c089a67af05e9a5b032299f57de6da7ce3f53db422beaf2c43fe71608a WatchSource:0}: Error finding container cf3716c089a67af05e9a5b032299f57de6da7ce3f53db422beaf2c43fe71608a: Status 404 returned error can't find the container with id cf3716c089a67af05e9a5b032299f57de6da7ce3f53db422beaf2c43fe71608a Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.391726 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b3eb3b7-9f03-46b3-890d-27429ead00a7" path="/var/lib/kubelet/pods/2b3eb3b7-9f03-46b3-890d-27429ead00a7/volumes" Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.392789 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69c815dc-b379-4325-90a5-2a86fc80b7e5" path="/var/lib/kubelet/pods/69c815dc-b379-4325-90a5-2a86fc80b7e5/volumes" Dec 04 15:24:05 crc kubenswrapper[4676]: I1204 15:24:05.393702 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35d3a3f-f614-45fa-a59a-e5cefa471321" path="/var/lib/kubelet/pods/d35d3a3f-f614-45fa-a59a-e5cefa471321/volumes" Dec 04 15:24:06 crc kubenswrapper[4676]: I1204 15:24:06.326180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" event={"ID":"20f45de0-b2ef-4fd6-9cf8-b79a0c831563","Type":"ContainerStarted","Data":"41dc0afddc85a394fe7601cf39625d64bd11cf48724adaafaeba70bcdc82ed6c"} Dec 04 15:24:06 crc kubenswrapper[4676]: I1204 15:24:06.326281 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" event={"ID":"20f45de0-b2ef-4fd6-9cf8-b79a0c831563","Type":"ContainerStarted","Data":"cf3716c089a67af05e9a5b032299f57de6da7ce3f53db422beaf2c43fe71608a"} Dec 04 15:24:06 crc kubenswrapper[4676]: I1204 15:24:06.326438 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:06 crc kubenswrapper[4676]: I1204 15:24:06.335429 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" Dec 04 15:24:06 crc kubenswrapper[4676]: I1204 
15:24:06.351000 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-f58fb8db6-2wvqm" podStartSLOduration=28.350973517 podStartE2EDuration="28.350973517s" podCreationTimestamp="2025-12-04 15:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:24:06.349414203 +0000 UTC m=+253.784084060" watchObservedRunningTime="2025-12-04 15:24:06.350973517 +0000 UTC m=+253.785643374" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.203050 4676 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.204508 4676 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.204654 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.204840 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9" gracePeriod=15 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.204853 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27" gracePeriod=15 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205024 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f" gracePeriod=15 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205051 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e" gracePeriod=15 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.204950 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c" gracePeriod=15 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205227 4676 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205695 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205714 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 
15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205726 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205735 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205775 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205786 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205798 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205805 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205820 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205851 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205872 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205880 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.205888 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.205895 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206126 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206146 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206182 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206191 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206198 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 04 15:24:09 crc 
kubenswrapper[4676]: I1204 15:24:09.206206 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.206388 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206403 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.206562 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.248863 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249047 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249100 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249170 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249262 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249309 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249385 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.249411 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.261558 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.347068 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.348644 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.349413 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27" exitCode=0 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.349449 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e" exitCode=0 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.349460 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c" exitCode=0 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.349484 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f" exitCode=2 Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.349523 4676 scope.go:117] "RemoveContainer" containerID="fa439557999e2f846b3972014edf0cbc511e9f8d6d8ca530c6472cba9e87fa58" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350303 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350378 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350471 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350500 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350541 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350611 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350631 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350643 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350668 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350692 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc 
kubenswrapper[4676]: I1204 15:24:09.350501 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350746 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350668 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.350838 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: I1204 15:24:09.558325 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:24:09 crc kubenswrapper[4676]: W1204 15:24:09.594133 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-eab5dbcf538e1d8508b9f7627ecd654aa678e33ff61dce7dad1bce10f2677e6e WatchSource:0}: Error finding container eab5dbcf538e1d8508b9f7627ecd654aa678e33ff61dce7dad1bce10f2677e6e: Status 404 returned error can't find the container with id eab5dbcf538e1d8508b9f7627ecd654aa678e33ff61dce7dad1bce10f2677e6e Dec 04 15:24:09 crc kubenswrapper[4676]: E1204 15:24:09.599147 4676 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e0c8066f19f2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 15:24:09.598271275 +0000 UTC m=+257.032941132,LastTimestamp:2025-12-04 15:24:09.598271275 +0000 UTC m=+257.032941132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.360478 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.364927 4676 generic.go:334] "Generic (PLEG): container finished" podID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" containerID="820f10ad00e9750187ce7ae5b48c49f6ca7f46efa860f86ac7ffdad0eb58cd97" exitCode=0 Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.365100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b378ae9f-e6e9-4e71-8fb4-56d6239599eb","Type":"ContainerDied","Data":"820f10ad00e9750187ce7ae5b48c49f6ca7f46efa860f86ac7ffdad0eb58cd97"} Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.366173 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.366378 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.367268 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1abf37bb2c7b0b1bf11ac3886352b968c5414ffed36c5f6d20ccdd2a439eba83"} Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.367303 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eab5dbcf538e1d8508b9f7627ecd654aa678e33ff61dce7dad1bce10f2677e6e"} Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.367649 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:10 crc kubenswrapper[4676]: I1204 15:24:10.367871 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:10 crc kubenswrapper[4676]: E1204 15:24:10.437474 4676 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" volumeName="registry-storage" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 
15:24:11.669662 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.670924 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.671596 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.784730 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access\") pod \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.784998 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir\") pod \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.785125 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock\") pod \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\" (UID: \"b378ae9f-e6e9-4e71-8fb4-56d6239599eb\") " Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.785321 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b378ae9f-e6e9-4e71-8fb4-56d6239599eb" (UID: "b378ae9f-e6e9-4e71-8fb4-56d6239599eb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.785468 4676 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.785520 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock" (OuterVolumeSpecName: "var-lock") pod "b378ae9f-e6e9-4e71-8fb4-56d6239599eb" (UID: "b378ae9f-e6e9-4e71-8fb4-56d6239599eb"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.791111 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b378ae9f-e6e9-4e71-8fb4-56d6239599eb" (UID: "b378ae9f-e6e9-4e71-8fb4-56d6239599eb"). InnerVolumeSpecName "kube-api-access". 
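
The unmount entries around this point trace the kubelet volume manager's pattern: a reconciler compares desired state (volumes the remaining pods still need) against actual state (volumes still mounted) and tears down the difference, logging "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached". Below is a minimal sketch of that reconcile pattern only; the types and function names are illustrative, not kubelet's real API.

    package main

    import "fmt"

    // volume is an illustrative stand-in for a mounted volume.
    type volume struct{ name, plugin string }

    // reconcile unmounts every volume that is still mounted (actual) but no
    // longer wanted (desired), in the three observable steps the log shows.
    func reconcile(desired, actual map[string]volume) {
        for name, v := range actual {
            if _, ok := desired[name]; ok {
                continue // still wanted; leave it mounted
            }
            fmt.Printf("UnmountVolume started for volume %q\n", name)
            // A real implementation would call the plugin (host-path,
            // projected, ...) to tear the mount down here.
            fmt.Printf("UnmountVolume.TearDown succeeded for volume %q (plugin %s)\n", name, v.plugin)
            delete(actual, name)
            fmt.Printf("Volume detached for volume %q\n", name)
        }
    }

    func main() {
        actual := map[string]volume{
            "kube-api-access": {"kube-api-access", "kubernetes.io/projected"},
            "var-lock":        {"var-lock", "kubernetes.io/host-path"},
        }
        reconcile(map[string]volume{}, actual) // pod deleted: nothing is desired
    }
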
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.886741 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.886797 4676 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b378ae9f-e6e9-4e71-8fb4-56d6239599eb-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.941350 4676 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.942013 4676 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.942673 4676 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.943045 4676 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.943340 4676 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:11 crc kubenswrapper[4676]: I1204 15:24:11.943384 4676 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 04 15:24:11 crc kubenswrapper[4676]: E1204 15:24:11.943829 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Dec 04 15:24:12 crc kubenswrapper[4676]: E1204 15:24:12.145350 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.386292 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.387392 4676 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9" exitCode=0 Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.390644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"b378ae9f-e6e9-4e71-8fb4-56d6239599eb","Type":"ContainerDied","Data":"aaea9132cc8de043236e20f0d19f5f59cc9f16c17dde2ee8cfb1d4a657d240bc"} Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.390677 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaea9132cc8de043236e20f0d19f5f59cc9f16c17dde2ee8cfb1d4a657d240bc" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.390838 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.406555 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.406816 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.483532 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.484826 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.485464 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.485735 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.486057 4676 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:12 crc kubenswrapper[4676]: E1204 15:24:12.547204 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.597836 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598005 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598035 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598028 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598089 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598163 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598576 4676 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598598 4676 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:12 crc kubenswrapper[4676]: I1204 15:24:12.598609 4676 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:24:13 crc kubenswrapper[4676]: E1204 15:24:13.348277 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.387336 4676 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.388078 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.388711 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.393269 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.402028 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.402886 4676 scope.go:117] "RemoveContainer" containerID="46a43b8645ebf5804042fcde5f031d08aa6cea36f7a6bec8e19c58b7e5fcbd27" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.403109 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.404050 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.404595 4676 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.404888 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.412143 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.413023 4676 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.413716 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.421604 4676 scope.go:117] "RemoveContainer" containerID="e53e1fa876152a24e96357b3840f331fde3ed86e8972798953a45c898dd8439e" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.433909 4676 scope.go:117] "RemoveContainer" containerID="3f5225cbe0e40cbce69831cc2c52c18cf4ff64defd80d9fb4b3aeb75baa0ed0c" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.456186 4676 scope.go:117] "RemoveContainer" containerID="41ae96e98b8083bb7ce64cb6c019d0c8aa78be7990e704b40b5f97718b86576f" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.469438 4676 scope.go:117] "RemoveContainer" containerID="c5c326293d1d8a1f3be453faad01716344b73e0974bfa84efc029d2e55107ba9" Dec 04 15:24:13 crc kubenswrapper[4676]: I1204 15:24:13.484576 4676 scope.go:117] "RemoveContainer" containerID="a36979d3e32d37df4b94af42432961abeeda5012ecb00eb1c6557cb5f9fce72e" Dec 04 15:24:14 crc kubenswrapper[4676]: E1204 15:24:14.950061 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Dec 04 15:24:16 crc kubenswrapper[4676]: E1204 15:24:16.288000 4676 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187e0c8066f19f2b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-04 15:24:09.598271275 +0000 UTC m=+257.032941132,LastTimestamp:2025-12-04 15:24:09.598271275 +0000 UTC m=+257.032941132,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 04 15:24:18 crc kubenswrapper[4676]: E1204 15:24:18.151492 4676 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.261300 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:24:19Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:24:19Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:24:19Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-04T15:24:19Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.262767 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.263344 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 
38.102.83.158:6443: connect: connection refused" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.263700 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.264024 4676 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:19 crc kubenswrapper[4676]: E1204 15:24:19.264051 4676 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.383525 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.384480 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.385063 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.405026 4676 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.406201 4676 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:20 crc kubenswrapper[4676]: E1204 15:24:20.406665 4676 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.407338 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:20 crc kubenswrapper[4676]: W1204 15:24:20.430965 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-1e9c288c6c2ea4476c198d649a8c36622aa1c1fa8bb23290023025f30803574f WatchSource:0}: Error finding container 1e9c288c6c2ea4476c198d649a8c36622aa1c1fa8bb23290023025f30803574f: Status 404 returned error can't find the container with id 1e9c288c6c2ea4476c198d649a8c36622aa1c1fa8bb23290023025f30803574f Dec 04 15:24:20 crc kubenswrapper[4676]: I1204 15:24:20.521400 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1e9c288c6c2ea4476c198d649a8c36622aa1c1fa8bb23290023025f30803574f"} Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.528325 4676 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e93efce17d54690e91f7966fbd314335e66bab7519769a1fdc55f5a43b1343a9" exitCode=0 Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.528398 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e93efce17d54690e91f7966fbd314335e66bab7519769a1fdc55f5a43b1343a9"} Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.528603 4676 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.528618 4676 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:21 crc kubenswrapper[4676]: E1204 15:24:21.528974 4676 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.528981 4676 status_manager.go:851] "Failed to get status for pod" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:21 crc kubenswrapper[4676]: I1204 15:24:21.529265 4676 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Dec 04 15:24:22 crc kubenswrapper[4676]: I1204 15:24:22.542724 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8532b9207bee0d9b38e5d53e2fbb23ed3bcb0167c8ad73e96c499981a4d065f9"} Dec 04 15:24:22 crc kubenswrapper[4676]: I1204 15:24:22.542769 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5506125908fe614a46fe89892711bfb106eef216970984109161aebd861a762f"} Dec 04 15:24:22 crc kubenswrapper[4676]: I1204 15:24:22.542781 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a32e11b318a5cba08e5af4012ee15f012569eb2720ea80c4c5cf755b31ee9c6"} Dec 04 15:24:23 crc kubenswrapper[4676]: I1204 15:24:23.554420 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"692ad80fe141d1ae5f70f4912f8c6f04075eff8035d88e1080148c0972c4a583"} Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.562574 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.562634 4676 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86" exitCode=1 Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.562700 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86"} Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.563304 4676 scope.go:117] "RemoveContainer" containerID="8bed7c504540fd364abe98633e1f5692b4cfa6f1dd63d59ea1cc44f0f3ffdc86" Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.568193 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d334fc8068e0e2aa330872c97ede2a7277c58b7f6b91e741c4189137c3847e01"} Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.568423 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.568531 4676 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:24 crc kubenswrapper[4676]: I1204 15:24:24.568561 4676 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.408383 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.409149 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.421190 4676 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]log ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]etcd ok Dec 04 15:24:25 crc kubenswrapper[4676]: 
[+]poststarthook/start-apiserver-admission-initializer ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/generic-apiserver-start-informers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/priority-and-fairness-filter ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-apiextensions-informers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-apiextensions-controllers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/crd-informer-synced ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-system-namespaces-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 04 15:24:25 crc kubenswrapper[4676]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/bootstrap-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/start-kube-aggregator-informers ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-registration-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-discovery-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]autoregister-completion ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-openapi-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 04 15:24:25 crc kubenswrapper[4676]: livez check failed Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.421311 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
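
The startup-probe output above is an aggregated health report: every registered check is printed with [+] or [-], a single failure flips the endpoint to HTTP 500, and the kubelet records the 500 as a failed probe. A self-contained sketch of that aggregation pattern follows; it is not the apiserver's real healthz package, and the check names are taken from the log only as examples.

    package main

    import (
        "fmt"
        "net/http"
    )

    // check pairs a name with a health function, as in the [+]/[-] report.
    type check struct {
        name string
        fn   func() error
    }

    // livez reports each check and returns 500 if any of them fails.
    func livez(checks []check) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            failed := false
            body := ""
            for _, c := range checks {
                if err := c.fn(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", c.name)
                }
            }
            if failed {
                w.WriteHeader(http.StatusInternalServerError) // the probe sees a 500
                body += "livez check failed\n"
            }
            fmt.Fprint(w, body)
        }
    }

    func main() {
        http.HandleFunc("/livez", livez([]check{
            {"ping", func() error { return nil }},
            {"poststarthook/rbac/bootstrap-roles", func() error { return fmt.Errorf("not finished") }},
        }))
        _ = http.ListenAndServe(":8080", nil) // port chosen arbitrarily
    }
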
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.577834 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 04 15:24:25 crc kubenswrapper[4676]: I1204 15:24:25.577937 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd29485e5c39796cecd8633d3bad9544059618379c098b3bb4da6d42276ffa4c"} Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.271521 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.271885 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.271980 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.272032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.274150 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.274282 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.274656 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.283230 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.283833 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.291203 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.296147 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.296276 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.506345 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.525587 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 04 15:24:27 crc kubenswrapper[4676]: I1204 15:24:27.531471 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:24:27 crc kubenswrapper[4676]: W1204 15:24:27.941887 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f8018442219cfdf7a4ce0c8b1de818034725aaf48e2dc689ee5d19a7def66b18 WatchSource:0}: Error finding container f8018442219cfdf7a4ce0c8b1de818034725aaf48e2dc689ee5d19a7def66b18: Status 404 returned error can't find the container with id f8018442219cfdf7a4ce0c8b1de818034725aaf48e2dc689ee5d19a7def66b18 Dec 04 15:24:28 crc kubenswrapper[4676]: W1204 15:24:28.015987 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-623cdfe62a66d41cd660238b38030975dd2c3b76ec4fa469a38a5c06f2073f16 WatchSource:0}: Error finding container 623cdfe62a66d41cd660238b38030975dd2c3b76ec4fa469a38a5c06f2073f16: Status 404 returned error can't find the container with id 623cdfe62a66d41cd660238b38030975dd2c3b76ec4fa469a38a5c06f2073f16 Dec 04 15:24:28 crc kubenswrapper[4676]: W1204 15:24:28.112679 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9df35e6196d322aa40b82fdc1592b25eb82d8bab767b21ce178e24e1ad965443 WatchSource:0}: Error finding container 9df35e6196d322aa40b82fdc1592b25eb82d8bab767b21ce178e24e1ad965443: Status 404 returned error can't find the container with id 9df35e6196d322aa40b82fdc1592b25eb82d8bab767b21ce178e24e1ad965443 Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.223424 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.227137 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.595659 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6775b2bf0e6f174099c4b73b757c53a194c87495ff56c3b05e45850e681c974f"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.595724 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9df35e6196d322aa40b82fdc1592b25eb82d8bab767b21ce178e24e1ad965443"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.595898 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.598396 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"cb66198e6031d3519219f47a7fdf70c912c9131a0a752a9eaf7366f40d3dc600"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.598435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"623cdfe62a66d41cd660238b38030975dd2c3b76ec4fa469a38a5c06f2073f16"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.600263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"3589159fd6e7998cf8a1b6eba511f574a1282917fa5df3332dbfd70332ab481e"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.600307 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f8018442219cfdf7a4ce0c8b1de818034725aaf48e2dc689ee5d19a7def66b18"} Dec 04 15:24:28 crc kubenswrapper[4676]: I1204 15:24:28.600511 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.606819 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.607133 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="3589159fd6e7998cf8a1b6eba511f574a1282917fa5df3332dbfd70332ab481e" exitCode=255 Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.607229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"3589159fd6e7998cf8a1b6eba511f574a1282917fa5df3332dbfd70332ab481e"} Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.607831 4676 scope.go:117] "RemoveContainer" containerID="3589159fd6e7998cf8a1b6eba511f574a1282917fa5df3332dbfd70332ab481e" Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.801264 4676 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.854830 4676 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15abca56-0391-4057-9a04-ddf488cee9aa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:24:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:24:21Z\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-04T15:24:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-cert-syncer kube-apiserver-cert-regeneration-controller 
kube-apiserver-insecure-readyz kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}}}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e93efce17d54690e91f7966fbd314335e66bab7519769a1fdc55f5a43b1343a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://e93efce17d54690e91f7966fbd314335e66bab7519769a1fdc55f5a43b1343a9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-04T15:24:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-04T15:24:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Pending\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Pod \"kube-apiserver-crc\" is invalid: metadata.uid: Invalid value: \"15abca56-0391-4057-9a04-ddf488cee9aa\": field is immutable" Dec 04 15:24:29 crc kubenswrapper[4676]: I1204 15:24:29.895679 4676 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d09ea563-b989-45fe-b250-25975672da0a" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.615193 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616019 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616074 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="731c372002a9203f86885b86541cab2a0f91e16a33a84e5bf251bfe4af41d5b9" exitCode=255 Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616112 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"731c372002a9203f86885b86541cab2a0f91e16a33a84e5bf251bfe4af41d5b9"} Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616169 4676 scope.go:117] "RemoveContainer" containerID="3589159fd6e7998cf8a1b6eba511f574a1282917fa5df3332dbfd70332ab481e" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616394 4676 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616408 4676 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="15abca56-0391-4057-9a04-ddf488cee9aa" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.616732 4676 scope.go:117] "RemoveContainer" containerID="731c372002a9203f86885b86541cab2a0f91e16a33a84e5bf251bfe4af41d5b9" Dec 04 15:24:30 crc kubenswrapper[4676]: E1204 15:24:30.617050 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:24:30 crc kubenswrapper[4676]: I1204 15:24:30.620572 4676 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d09ea563-b989-45fe-b250-25975672da0a" Dec 04 
15:24:31 crc kubenswrapper[4676]: I1204 15:24:31.626379 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 04 15:24:40 crc kubenswrapper[4676]: I1204 15:24:40.769353 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 04 15:24:40 crc kubenswrapper[4676]: I1204 15:24:40.771756 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 04 15:24:40 crc kubenswrapper[4676]: I1204 15:24:40.887695 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 04 15:24:41 crc kubenswrapper[4676]: I1204 15:24:41.101491 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 15:24:41 crc kubenswrapper[4676]: I1204 15:24:41.416679 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 04 15:24:41 crc kubenswrapper[4676]: I1204 15:24:41.500372 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 04 15:24:41 crc kubenswrapper[4676]: I1204 15:24:41.828081 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 04 15:24:41 crc kubenswrapper[4676]: I1204 15:24:41.924526 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.185742 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.217977 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.364849 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.372641 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.485256 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.499311 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.540587 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.579011 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.629154 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.799706 4676 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 04 15:24:42 crc kubenswrapper[4676]: I1204 15:24:42.811279 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.081710 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.202781 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.202966 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.263944 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.273431 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.288792 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.394654 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.398601 4676 scope.go:117] "RemoveContainer" containerID="731c372002a9203f86885b86541cab2a0f91e16a33a84e5bf251bfe4af41d5b9" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.511112 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.541533 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.633819 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.713761 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.763203 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.763268 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a"} Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.957413 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 04 15:24:43 crc kubenswrapper[4676]: I1204 15:24:43.959708 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 04 15:24:44 crc kubenswrapper[4676]: 
I1204 15:24:44.112691 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.126950 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.207834 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.326263 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.616720 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.635487 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.659169 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.769379 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.770071 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.770117 4676 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a" exitCode=255 Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.770146 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a"} Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.770178 4676 scope.go:117] "RemoveContainer" containerID="731c372002a9203f86885b86541cab2a0f91e16a33a84e5bf251bfe4af41d5b9" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.771121 4676 scope.go:117] "RemoveContainer" containerID="86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a" Dec 04 15:24:44 crc kubenswrapper[4676]: E1204 15:24:44.771435 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.825756 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.863207 4676 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 04 15:24:44 crc kubenswrapper[4676]: I1204 15:24:44.998838 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.117022 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.161348 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.288727 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.315355 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.458510 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.460743 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.483602 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.484416 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.501111 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.517183 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.533525 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.537522 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.585424 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.633585 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.724926 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.773715 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.777533 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.782765 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.891288 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.893032 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.932833 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.961750 4676 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 04 15:24:45 crc kubenswrapper[4676]: I1204 15:24:45.977554 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.041620 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.077378 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.177164 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.189331 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.205131 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.212789 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.239515 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.315791 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.512025 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.543542 4676 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.574262 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.576267 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 
15:24:46.589510 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.627298 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.639719 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.677038 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.718588 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.721768 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.783283 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.847503 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.920546 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.982625 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 15:24:46 crc kubenswrapper[4676]: I1204 15:24:46.983166 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.024710 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.073446 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.431231 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.437482 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.437543 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.437765 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.444542 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.445831 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.446107 4676 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.446390 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.446465 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.673670 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.711109 4676 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 04 15:24:47 crc kubenswrapper[4676]: I1204 15:24:47.797327 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.022140 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.071470 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.142171 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.178568 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.180592 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.205619 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.294890 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.496053 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.545271 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.686255 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.697464 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.778343 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 15:24:48.926366 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 04 15:24:48 crc kubenswrapper[4676]: I1204 
15:24:48.938276 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.046031 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.062438 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.135648 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.155592 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.178713 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.193018 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.269882 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.407754 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.463880 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.469472 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.522433 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.585974 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.604173 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.618436 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.673345 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.707070 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.751812 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.757332 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 04 15:24:49 crc kubenswrapper[4676]: I1204 15:24:49.891785 4676 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.016545 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.172584 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.203428 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.251396 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.412274 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.417791 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.574954 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.598997 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.599230 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.599400 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.599578 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.599899 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.600142 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.603857 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.614507 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.615135 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.742875 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.752674 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.761692 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.774612 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.883242 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 04 15:24:50 crc kubenswrapper[4676]: I1204 15:24:50.946652 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.025800 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.094977 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.151258 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.166013 4676 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.176494 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.177178 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.356741 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.395162 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.422738 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.450475 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.591607 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.622023 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.641534 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.784218 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.791285 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 04 
15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.878935 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.897182 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.900384 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 04 15:24:51 crc kubenswrapper[4676]: I1204 15:24:51.944639 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.031482 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.083021 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.193159 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.222285 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.229750 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.244148 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.294184 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.325803 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.377645 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.580843 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.584737 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.585240 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.585393 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.586216 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.599849 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.759574 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.818488 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.856614 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.968700 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 04 15:24:52 crc kubenswrapper[4676]: I1204 15:24:52.998045 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.013533 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.111073 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.260348 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.264660 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.342626 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.379352 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.434168 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.447009 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.589250 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.659659 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.661077 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.704502 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.704793 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.770280 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.802979 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 04 15:24:53 crc kubenswrapper[4676]: I1204 15:24:53.920825 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.047251 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.067423 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.069238 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.328306 4676 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.334229 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.334196113 podStartE2EDuration="45.334196113s" podCreationTimestamp="2025-12-04 15:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:24:29.89147691 +0000 UTC m=+277.326146777" watchObservedRunningTime="2025-12-04 15:24:54.334196113 +0000 UTC m=+301.768865970" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.335769 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.335824 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.336191 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.341230 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.359765 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.359742038 podStartE2EDuration="25.359742038s" podCreationTimestamp="2025-12-04 15:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:24:54.355193585 +0000 UTC m=+301.789863442" watchObservedRunningTime="2025-12-04 15:24:54.359742038 +0000 UTC m=+301.794411895" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.412694 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.426682 4676 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.428268 4676 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.655516 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.763223 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.813422 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.866376 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 04 15:24:54 crc kubenswrapper[4676]: I1204 15:24:54.948706 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.091702 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.103577 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.146879 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.169864 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.215119 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.240890 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.270621 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.414299 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.431491 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.517403 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.527685 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.633260 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.699228 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.740087 4676 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.840026 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 04 15:24:55 crc kubenswrapper[4676]: I1204 15:24:55.851674 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.186366 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.317875 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.484681 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.569234 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.685604 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.747795 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 04 15:24:56 crc kubenswrapper[4676]: I1204 15:24:56.969759 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 15:24:57 crc kubenswrapper[4676]: I1204 15:24:57.270382 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 04 15:24:57 crc kubenswrapper[4676]: I1204 15:24:57.385757 4676 scope.go:117] "RemoveContainer" containerID="86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a" Dec 04 15:24:57 crc kubenswrapper[4676]: E1204 15:24:57.385998 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 04 15:24:57 crc kubenswrapper[4676]: I1204 15:24:57.911387 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 04 15:24:58 crc kubenswrapper[4676]: I1204 15:24:58.642601 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 04 15:24:59 crc kubenswrapper[4676]: I1204 15:24:59.673158 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 15:25:02 crc kubenswrapper[4676]: I1204 15:25:02.657136 4676 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 15:25:02 crc kubenswrapper[4676]: I1204 15:25:02.657722 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://1abf37bb2c7b0b1bf11ac3886352b968c5414ffed36c5f6d20ccdd2a439eba83" gracePeriod=5 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.020133 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.020835 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zd784" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="registry-server" containerID="cri-o://f6f46434e5c90eea329a9cbe61386e51ae33c5cd53ef7a27a95fa3359b018b16" gracePeriod=30 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.042495 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.042927 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ml7rm" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="registry-server" containerID="cri-o://d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77" gracePeriod=30 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.049238 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.049555 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator" containerID="cri-o://4b0687231ef46f1df1ea3301976b5482d48f8ffea2f118b12df9738514bf5a3a" gracePeriod=30 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.070817 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.071363 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2brr7" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="registry-server" containerID="cri-o://88443358f044707098a6957bb486390af8265efb28696463f962f7bd7cffa00b" gracePeriod=30 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.094274 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.094819 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tx6hs" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="registry-server" containerID="cri-o://776decc324e4c9692ffb753724e2d065ae05bbf134d25d0e9d8887210b226df0" gracePeriod=30 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.101126 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgxsk"] Dec 04 15:25:05 crc kubenswrapper[4676]: E1204 15:25:05.110764 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.110817 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 15:25:05 crc kubenswrapper[4676]: E1204 15:25:05.110851 
4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" containerName="installer" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.110862 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" containerName="installer" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.111053 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b378ae9f-e6e9-4e71-8fb4-56d6239599eb" containerName="installer" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.111064 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.111987 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.113083 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgxsk"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.300431 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.300532 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.300552 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bqhq\" (UniqueName: \"kubernetes.io/projected/84e2ecfe-0652-42eb-9440-0b03a4722150-kube-api-access-7bqhq\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.401961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.402591 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bqhq\" (UniqueName: \"kubernetes.io/projected/84e2ecfe-0652-42eb-9440-0b03a4722150-kube-api-access-7bqhq\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.403147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.403609 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.417975 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/84e2ecfe-0652-42eb-9440-0b03a4722150-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.422111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bqhq\" (UniqueName: \"kubernetes.io/projected/84e2ecfe-0652-42eb-9440-0b03a4722150-kube-api-access-7bqhq\") pod \"marketplace-operator-79b997595-jgxsk\" (UID: \"84e2ecfe-0652-42eb-9440-0b03a4722150\") " pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.430363 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.641343 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jgxsk"] Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.892649 4676 generic.go:334] "Generic (PLEG): container finished" podID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerID="f6f46434e5c90eea329a9cbe61386e51ae33c5cd53ef7a27a95fa3359b018b16" exitCode=0 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.892742 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerDied","Data":"f6f46434e5c90eea329a9cbe61386e51ae33c5cd53ef7a27a95fa3359b018b16"} Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.896113 4676 generic.go:334] "Generic (PLEG): container finished" podID="1aa95312-1f71-4167-9982-352d67b49f03" containerID="776decc324e4c9692ffb753724e2d065ae05bbf134d25d0e9d8887210b226df0" exitCode=0 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.896168 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerDied","Data":"776decc324e4c9692ffb753724e2d065ae05bbf134d25d0e9d8887210b226df0"} Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.899011 4676 generic.go:334] "Generic (PLEG): container finished" podID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerID="88443358f044707098a6957bb486390af8265efb28696463f962f7bd7cffa00b" exitCode=0 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.899063 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" 
event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerDied","Data":"88443358f044707098a6957bb486390af8265efb28696463f962f7bd7cffa00b"} Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.900054 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" event={"ID":"84e2ecfe-0652-42eb-9440-0b03a4722150","Type":"ContainerStarted","Data":"585d93cfb0f56592090ccba1fecafc6a4da35b3a7a921a864a93505a190687a2"} Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.901323 4676 generic.go:334] "Generic (PLEG): container finished" podID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerID="4b0687231ef46f1df1ea3301976b5482d48f8ffea2f118b12df9738514bf5a3a" exitCode=0 Dec 04 15:25:05 crc kubenswrapper[4676]: I1204 15:25:05.901352 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" event={"ID":"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d","Type":"ContainerDied","Data":"4b0687231ef46f1df1ea3301976b5482d48f8ffea2f118b12df9738514bf5a3a"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.216533 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.354806 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content\") pod \"1aa95312-1f71-4167-9982-352d67b49f03\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.354865 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdst\" (UniqueName: \"kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst\") pod \"1aa95312-1f71-4167-9982-352d67b49f03\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.378205 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst" (OuterVolumeSpecName: "kube-api-access-wmdst") pod "1aa95312-1f71-4167-9982-352d67b49f03" (UID: "1aa95312-1f71-4167-9982-352d67b49f03"). InnerVolumeSpecName "kube-api-access-wmdst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.417521 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.446565 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458474 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities\") pod \"009171f0-c033-4ea6-b46d-0155fe9f3e71\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458530 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities\") pod \"1aa95312-1f71-4167-9982-352d67b49f03\" (UID: \"1aa95312-1f71-4167-9982-352d67b49f03\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458561 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmlns\" (UniqueName: \"kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns\") pod \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458610 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbc79\" (UniqueName: \"kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79\") pod \"009171f0-c033-4ea6-b46d-0155fe9f3e71\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458642 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics\") pod \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458671 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca\") pod \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\" (UID: \"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458705 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content\") pod \"009171f0-c033-4ea6-b46d-0155fe9f3e71\" (UID: \"009171f0-c033-4ea6-b46d-0155fe9f3e71\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.458922 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdst\" (UniqueName: \"kubernetes.io/projected/1aa95312-1f71-4167-9982-352d67b49f03-kube-api-access-wmdst\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.460226 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities" (OuterVolumeSpecName: "utilities") pod "009171f0-c033-4ea6-b46d-0155fe9f3e71" (UID: "009171f0-c033-4ea6-b46d-0155fe9f3e71"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.460953 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities" (OuterVolumeSpecName: "utilities") pod "1aa95312-1f71-4167-9982-352d67b49f03" (UID: "1aa95312-1f71-4167-9982-352d67b49f03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.463542 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" (UID: "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.474206 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79" (OuterVolumeSpecName: "kube-api-access-wbc79") pod "009171f0-c033-4ea6-b46d-0155fe9f3e71" (UID: "009171f0-c033-4ea6-b46d-0155fe9f3e71"). InnerVolumeSpecName "kube-api-access-wbc79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.474728 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" (UID: "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.475449 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns" (OuterVolumeSpecName: "kube-api-access-jmlns") pod "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" (UID: "8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d"). InnerVolumeSpecName "kube-api-access-jmlns". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.529214 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "009171f0-c033-4ea6-b46d-0155fe9f3e71" (UID: "009171f0-c033-4ea6-b46d-0155fe9f3e71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.532308 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1aa95312-1f71-4167-9982-352d67b49f03" (UID: "1aa95312-1f71-4167-9982-352d67b49f03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560107 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbc79\" (UniqueName: \"kubernetes.io/projected/009171f0-c033-4ea6-b46d-0155fe9f3e71-kube-api-access-wbc79\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560156 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560166 4676 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560208 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560217 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/009171f0-c033-4ea6-b46d-0155fe9f3e71-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560227 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560235 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmlns\" (UniqueName: \"kubernetes.io/projected/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d-kube-api-access-jmlns\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.560244 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1aa95312-1f71-4167-9982-352d67b49f03-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.671581 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.682953 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762174 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities\") pod \"a945f156-c10a-4132-8fb4-e43040790a01\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762265 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities\") pod \"131c312c-f19d-4e87-8f86-8d38926b2d87\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762294 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content\") pod \"a945f156-c10a-4132-8fb4-e43040790a01\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762319 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7llgq\" (UniqueName: \"kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq\") pod \"a945f156-c10a-4132-8fb4-e43040790a01\" (UID: \"a945f156-c10a-4132-8fb4-e43040790a01\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bw5n\" (UniqueName: \"kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n\") pod \"131c312c-f19d-4e87-8f86-8d38926b2d87\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.762425 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content\") pod \"131c312c-f19d-4e87-8f86-8d38926b2d87\" (UID: \"131c312c-f19d-4e87-8f86-8d38926b2d87\") " Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.763755 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities" (OuterVolumeSpecName: "utilities") pod "131c312c-f19d-4e87-8f86-8d38926b2d87" (UID: "131c312c-f19d-4e87-8f86-8d38926b2d87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.764687 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities" (OuterVolumeSpecName: "utilities") pod "a945f156-c10a-4132-8fb4-e43040790a01" (UID: "a945f156-c10a-4132-8fb4-e43040790a01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.765858 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n" (OuterVolumeSpecName: "kube-api-access-2bw5n") pod "131c312c-f19d-4e87-8f86-8d38926b2d87" (UID: "131c312c-f19d-4e87-8f86-8d38926b2d87"). InnerVolumeSpecName "kube-api-access-2bw5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.767643 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq" (OuterVolumeSpecName: "kube-api-access-7llgq") pod "a945f156-c10a-4132-8fb4-e43040790a01" (UID: "a945f156-c10a-4132-8fb4-e43040790a01"). InnerVolumeSpecName "kube-api-access-7llgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.800456 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "131c312c-f19d-4e87-8f86-8d38926b2d87" (UID: "131c312c-f19d-4e87-8f86-8d38926b2d87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.834234 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a945f156-c10a-4132-8fb4-e43040790a01" (UID: "a945f156-c10a-4132-8fb4-e43040790a01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863734 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863778 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863791 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/131c312c-f19d-4e87-8f86-8d38926b2d87-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863800 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a945f156-c10a-4132-8fb4-e43040790a01-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863810 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7llgq\" (UniqueName: \"kubernetes.io/projected/a945f156-c10a-4132-8fb4-e43040790a01-kube-api-access-7llgq\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.863823 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bw5n\" (UniqueName: \"kubernetes.io/projected/131c312c-f19d-4e87-8f86-8d38926b2d87-kube-api-access-2bw5n\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.908796 4676 generic.go:334] "Generic (PLEG): container finished" podID="a945f156-c10a-4132-8fb4-e43040790a01" containerID="d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77" exitCode=0 Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.908881 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" 
event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerDied","Data":"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.908963 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml7rm" event={"ID":"a945f156-c10a-4132-8fb4-e43040790a01","Type":"ContainerDied","Data":"c513391450352cf3180e27eedb6d41b560cdf631d4e0b4d0fc7395392a0de392"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.908982 4676 scope.go:117] "RemoveContainer" containerID="d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.909286 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml7rm" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.911175 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2brr7" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.911174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2brr7" event={"ID":"131c312c-f19d-4e87-8f86-8d38926b2d87","Type":"ContainerDied","Data":"04afd6b7ce3f5fddf49f379e8307f30d8def9163c18cc9f7dad187121b53754e"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.913733 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" event={"ID":"84e2ecfe-0652-42eb-9440-0b03a4722150","Type":"ContainerStarted","Data":"7e4cb1b2074ffef4b2b638110ef354cae4a0c162f79a87e8877908e549ce21f5"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.913940 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.918933 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" event={"ID":"8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d","Type":"ContainerDied","Data":"31c81bf182410af48f2ab29fd61cf1d7bde863858809722c8014d2d137706828"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.918977 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4627g" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.925150 4676 scope.go:117] "RemoveContainer" containerID="cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.932292 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.943336 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zd784" event={"ID":"009171f0-c033-4ea6-b46d-0155fe9f3e71","Type":"ContainerDied","Data":"bccdb6b590f45ff8071e3c1756f0a461afee34fa14da31ed2fe4179bfc338d08"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.943355 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zd784" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.946444 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tx6hs" event={"ID":"1aa95312-1f71-4167-9982-352d67b49f03","Type":"ContainerDied","Data":"5725bf4099cdb8385d3f8d0d566605866978fb3279a281b1edd37781f03148dc"} Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.946493 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tx6hs" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.975510 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jgxsk" podStartSLOduration=1.9754914270000001 podStartE2EDuration="1.975491427s" podCreationTimestamp="2025-12-04 15:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:25:06.941700332 +0000 UTC m=+314.376370189" watchObservedRunningTime="2025-12-04 15:25:06.975491427 +0000 UTC m=+314.410161274" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.976440 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.979641 4676 scope.go:117] "RemoveContainer" containerID="8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80" Dec 04 15:25:06 crc kubenswrapper[4676]: I1204 15:25:06.979995 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ml7rm"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.011927 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.014520 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2brr7"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.018958 4676 scope.go:117] "RemoveContainer" containerID="d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.019335 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:25:07 crc kubenswrapper[4676]: E1204 15:25:07.019365 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77\": container with ID starting with d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77 not found: ID does not exist" containerID="d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.019408 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77"} err="failed to get container status \"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77\": rpc error: code = NotFound desc = could not find container \"d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77\": container with ID starting with d3395268661479df86d2b2a6facd547e4aba4b12af64c33532de8b6f099a0c77 not found: ID does not exist" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.019439 4676 
scope.go:117] "RemoveContainer" containerID="cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1" Dec 04 15:25:07 crc kubenswrapper[4676]: E1204 15:25:07.019809 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1\": container with ID starting with cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1 not found: ID does not exist" containerID="cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.019849 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1"} err="failed to get container status \"cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1\": rpc error: code = NotFound desc = could not find container \"cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1\": container with ID starting with cfd88e0054983f618234b201e8892d866f71905ef6ffb04215abdc4306de8ce1 not found: ID does not exist" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.019880 4676 scope.go:117] "RemoveContainer" containerID="8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80" Dec 04 15:25:07 crc kubenswrapper[4676]: E1204 15:25:07.020297 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80\": container with ID starting with 8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80 not found: ID does not exist" containerID="8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.020333 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80"} err="failed to get container status \"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80\": rpc error: code = NotFound desc = could not find container \"8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80\": container with ID starting with 8144a6faf43eb89e4a749711f81f66fdafe7861b198de4b8311820185ddc2c80 not found: ID does not exist" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.020355 4676 scope.go:117] "RemoveContainer" containerID="88443358f044707098a6957bb486390af8265efb28696463f962f7bd7cffa00b" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.023193 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4627g"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.032496 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.037341 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tx6hs"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.040344 4676 scope.go:117] "RemoveContainer" containerID="9e351689f2d75d62635efced3236a21ddc32897aa607954647fdad7d26cb2408" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.049012 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.053273 4676 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zd784"] Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.057744 4676 scope.go:117] "RemoveContainer" containerID="5ea13300ee3dc3a117c4795d066fdf9a06abed26bd2d8fe9e5eac05c915caabd" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.072762 4676 scope.go:117] "RemoveContainer" containerID="4b0687231ef46f1df1ea3301976b5482d48f8ffea2f118b12df9738514bf5a3a" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.090147 4676 scope.go:117] "RemoveContainer" containerID="f6f46434e5c90eea329a9cbe61386e51ae33c5cd53ef7a27a95fa3359b018b16" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.107012 4676 scope.go:117] "RemoveContainer" containerID="ab4fd5783279cead129241dceb4ace52911cf8e650645bb02fe3dff611031b37" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.128707 4676 scope.go:117] "RemoveContainer" containerID="6de33229affe6b4b49e743d867488a601736f34be02f2a63e7419605b9e577c1" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.190256 4676 scope.go:117] "RemoveContainer" containerID="776decc324e4c9692ffb753724e2d065ae05bbf134d25d0e9d8887210b226df0" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.205183 4676 scope.go:117] "RemoveContainer" containerID="cff388542e0445c8cb02d77f4f3ebff3af17e3797020efa9358bae35a959e883" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.221036 4676 scope.go:117] "RemoveContainer" containerID="3cb7423829b9d7255b8a038092e56073f245b18cde414f9b93a141f9f06e8eb9" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.391043 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" path="/var/lib/kubelet/pods/009171f0-c033-4ea6-b46d-0155fe9f3e71/volumes" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.392379 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" path="/var/lib/kubelet/pods/131c312c-f19d-4e87-8f86-8d38926b2d87/volumes" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.393256 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa95312-1f71-4167-9982-352d67b49f03" path="/var/lib/kubelet/pods/1aa95312-1f71-4167-9982-352d67b49f03/volumes" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.394400 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" path="/var/lib/kubelet/pods/8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d/volumes" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.395451 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a945f156-c10a-4132-8fb4-e43040790a01" path="/var/lib/kubelet/pods/a945f156-c10a-4132-8fb4-e43040790a01/volumes" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.535249 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.957662 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 15:25:07 crc kubenswrapper[4676]: I1204 15:25:07.957728 4676 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="1abf37bb2c7b0b1bf11ac3886352b968c5414ffed36c5f6d20ccdd2a439eba83" exitCode=137 Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.236619 4676 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.236702 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383295 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383356 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383392 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383445 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383473 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383741 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383741 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383773 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.383805 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.395651 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.485030 4676 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.485188 4676 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.485274 4676 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.485338 4676 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.485354 4676 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.965563 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.965707 4676 scope.go:117] "RemoveContainer" containerID="1abf37bb2c7b0b1bf11ac3886352b968c5414ffed36c5f6d20ccdd2a439eba83" Dec 04 15:25:08 crc kubenswrapper[4676]: I1204 15:25:08.965722 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.392053 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.393444 4676 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.405795 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.405851 4676 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b8793ef-d6a7-4059-b8ca-30ee74954cbd" Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.409185 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 04 15:25:09 crc kubenswrapper[4676]: I1204 15:25:09.409238 4676 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="1b8793ef-d6a7-4059-b8ca-30ee74954cbd" Dec 04 15:25:10 crc kubenswrapper[4676]: I1204 15:25:10.384815 4676 scope.go:117] "RemoveContainer" containerID="86e788d277c34163289c7b3e9a64e32ddd49550270151663fc28a682c7cbbb9a" Dec 04 15:25:10 crc kubenswrapper[4676]: I1204 15:25:10.979558 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Dec 04 15:25:10 crc kubenswrapper[4676]: I1204 15:25:10.979883 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c6f8b4b0c83c5ddf5f9890ffdf2c452c97a699715d1e981dd62c729e78fe6994"} Dec 04 15:25:36 crc kubenswrapper[4676]: I1204 15:25:36.863871 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:25:36 crc kubenswrapper[4676]: I1204 15:25:36.864653 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" containerName="controller-manager" containerID="cri-o://476c1d841b71b355a86d80c332a53ac94962c9f3bf87315f6d51bc4ed6f0dca2" gracePeriod=30 Dec 04 15:25:36 crc kubenswrapper[4676]: I1204 15:25:36.980256 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:25:36 crc kubenswrapper[4676]: I1204 15:25:36.980832 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" podUID="a735889f-51fc-49e1-8756-4f9dc2c05d94" containerName="route-controller-manager" containerID="cri-o://f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076" gracePeriod=30 Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.124423 4676 generic.go:334] "Generic (PLEG): container finished" podID="591b399c-21b2-4c6f-ab3a-d424df670c0b" 
containerID="476c1d841b71b355a86d80c332a53ac94962c9f3bf87315f6d51bc4ed6f0dca2" exitCode=0 Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.124480 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" event={"ID":"591b399c-21b2-4c6f-ab3a-d424df670c0b","Type":"ContainerDied","Data":"476c1d841b71b355a86d80c332a53ac94962c9f3bf87315f6d51bc4ed6f0dca2"} Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.234095 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.353824 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.400392 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8q7b\" (UniqueName: \"kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b\") pod \"591b399c-21b2-4c6f-ab3a-d424df670c0b\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.400551 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert\") pod \"591b399c-21b2-4c6f-ab3a-d424df670c0b\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.400591 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles\") pod \"591b399c-21b2-4c6f-ab3a-d424df670c0b\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.400629 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config\") pod \"591b399c-21b2-4c6f-ab3a-d424df670c0b\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.400663 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca\") pod \"591b399c-21b2-4c6f-ab3a-d424df670c0b\" (UID: \"591b399c-21b2-4c6f-ab3a-d424df670c0b\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.401850 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "591b399c-21b2-4c6f-ab3a-d424df670c0b" (UID: "591b399c-21b2-4c6f-ab3a-d424df670c0b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.402232 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca" (OuterVolumeSpecName: "client-ca") pod "591b399c-21b2-4c6f-ab3a-d424df670c0b" (UID: "591b399c-21b2-4c6f-ab3a-d424df670c0b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.402823 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config" (OuterVolumeSpecName: "config") pod "591b399c-21b2-4c6f-ab3a-d424df670c0b" (UID: "591b399c-21b2-4c6f-ab3a-d424df670c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.407700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b" (OuterVolumeSpecName: "kube-api-access-t8q7b") pod "591b399c-21b2-4c6f-ab3a-d424df670c0b" (UID: "591b399c-21b2-4c6f-ab3a-d424df670c0b"). InnerVolumeSpecName "kube-api-access-t8q7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.407981 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "591b399c-21b2-4c6f-ab3a-d424df670c0b" (UID: "591b399c-21b2-4c6f-ab3a-d424df670c0b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502390 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57fzv\" (UniqueName: \"kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv\") pod \"a735889f-51fc-49e1-8756-4f9dc2c05d94\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502496 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca\") pod \"a735889f-51fc-49e1-8756-4f9dc2c05d94\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502529 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert\") pod \"a735889f-51fc-49e1-8756-4f9dc2c05d94\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502564 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config\") pod \"a735889f-51fc-49e1-8756-4f9dc2c05d94\" (UID: \"a735889f-51fc-49e1-8756-4f9dc2c05d94\") " Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502830 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8q7b\" (UniqueName: \"kubernetes.io/projected/591b399c-21b2-4c6f-ab3a-d424df670c0b-kube-api-access-t8q7b\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502843 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/591b399c-21b2-4c6f-ab3a-d424df670c0b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502852 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502862 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.502871 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/591b399c-21b2-4c6f-ab3a-d424df670c0b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.503585 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca" (OuterVolumeSpecName: "client-ca") pod "a735889f-51fc-49e1-8756-4f9dc2c05d94" (UID: "a735889f-51fc-49e1-8756-4f9dc2c05d94"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.503837 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config" (OuterVolumeSpecName: "config") pod "a735889f-51fc-49e1-8756-4f9dc2c05d94" (UID: "a735889f-51fc-49e1-8756-4f9dc2c05d94"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.506363 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a735889f-51fc-49e1-8756-4f9dc2c05d94" (UID: "a735889f-51fc-49e1-8756-4f9dc2c05d94"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.506482 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv" (OuterVolumeSpecName: "kube-api-access-57fzv") pod "a735889f-51fc-49e1-8756-4f9dc2c05d94" (UID: "a735889f-51fc-49e1-8756-4f9dc2c05d94"). InnerVolumeSpecName "kube-api-access-57fzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.604492 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a735889f-51fc-49e1-8756-4f9dc2c05d94-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.604545 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.604555 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57fzv\" (UniqueName: \"kubernetes.io/projected/a735889f-51fc-49e1-8756-4f9dc2c05d94-kube-api-access-57fzv\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:37 crc kubenswrapper[4676]: I1204 15:25:37.604568 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a735889f-51fc-49e1-8756-4f9dc2c05d94-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.130386 4676 generic.go:334] "Generic (PLEG): container finished" podID="a735889f-51fc-49e1-8756-4f9dc2c05d94" containerID="f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076" exitCode=0 Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.130465 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" event={"ID":"a735889f-51fc-49e1-8756-4f9dc2c05d94","Type":"ContainerDied","Data":"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076"} Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.130494 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" event={"ID":"a735889f-51fc-49e1-8756-4f9dc2c05d94","Type":"ContainerDied","Data":"50493e69647a3cf0ad71e43163442a5f7155134f6e94b53bee84bced380052c7"} Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.130513 4676 scope.go:117] "RemoveContainer" containerID="f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.131265 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.132643 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" event={"ID":"591b399c-21b2-4c6f-ab3a-d424df670c0b","Type":"ContainerDied","Data":"52bf81443f7bd00b4e502eb20eb76338c3efba6f8e1ec377fdb8a221641e77bd"} Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.132746 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dlhc6" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.148653 4676 scope.go:117] "RemoveContainer" containerID="f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.149606 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076\": container with ID starting with f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076 not found: ID does not exist" containerID="f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.149684 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076"} err="failed to get container status \"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076\": rpc error: code = NotFound desc = could not find container \"f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076\": container with ID starting with f6af2d196bfff8717edfac93c68d59a7b69bbfa008ae6f3709b0ba72891d7076 not found: ID does not exist" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.149719 4676 scope.go:117] "RemoveContainer" containerID="476c1d841b71b355a86d80c332a53ac94962c9f3bf87315f6d51bc4ed6f0dca2" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.163834 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.167174 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dlhc6"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.181458 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.184209 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-w9pnw"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.674647 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.674951 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.674971 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.674987 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.674995 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675007 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675016 4676 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675027 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675035 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675046 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675054 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675063 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675071 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675082 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675090 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675102 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675110 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="extract-utilities" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675123 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675131 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="registry-server" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675143 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" containerName="controller-manager" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675150 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" containerName="controller-manager" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675165 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675172 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="extract-content" Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675180 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a735889f-51fc-49e1-8756-4f9dc2c05d94" containerName="route-controller-manager" Dec 04 
15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675187 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a735889f-51fc-49e1-8756-4f9dc2c05d94" containerName="route-controller-manager"
Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675197 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="extract-utilities"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675205 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="extract-utilities"
Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675214 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="extract-content"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675221 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="extract-content"
Dec 04 15:25:38 crc kubenswrapper[4676]: E1204 15:25:38.675231 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="registry-server"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675240 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="registry-server"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675362 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="131c312c-f19d-4e87-8f86-8d38926b2d87" containerName="registry-server"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675380 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a735889f-51fc-49e1-8756-4f9dc2c05d94" containerName="route-controller-manager"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675393 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bbd5fda-37c7-49d7-b5b2-fa9ce62e5f9d" containerName="marketplace-operator"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675404 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a945f156-c10a-4132-8fb4-e43040790a01" containerName="registry-server"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675413 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" containerName="controller-manager"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675423 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="009171f0-c033-4ea6-b46d-0155fe9f3e71" containerName="registry-server"
Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.675430 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa95312-1f71-4167-9982-352d67b49f03" containerName="registry-server"
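
Interleaved with the SyncLoop ADD above, the CPU and memory managers prune per-container accounting left over from pods that no longer exist on the node: the two controller managers that were just deleted plus several finished marketplace catalog pods. The cpu_manager lines carry E (error) severity, but in this capture each one is immediately paired with a successful "Deleted CPUSet assignment", so they read as routine admission-time cleanup rather than failures. A quick way to sanity-check that pairing, sketched in Python under the same stdin assumption as above:

```python
import re
import sys
from collections import defaultdict

# Collects every stale-state removal the CPU/memory managers report,
# keyed by pod UID, so leftover accounting is easy to audit per pod.
STALE = re.compile(
    r'"RemoveStaleState:? removing (?:container|state)" '
    r'podUID="(?P<uid>[^"]+)" containerName="(?P<name>[^"]+)"'
)

per_pod = defaultdict(set)
for line in sys.stdin:
    if m := STALE.search(line):
        per_pod[m["uid"]].add(m["name"])

for uid, names in sorted(per_pod.items()):
    print(uid, ", ".join(sorted(names)))
```
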
Need to start a new one" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.678678 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.678737 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.679107 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.680387 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.680527 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.687787 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.689780 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.692513 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.696366 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.697230 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.699810 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.704136 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.704397 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.704411 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.704444 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.704564 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.719340 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817435 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mr2\" (UniqueName: \"kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817530 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrjn\" (UniqueName: \"kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817650 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817671 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config\") pod 
\"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817694 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817707 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817735 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.817760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919669 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mr2\" (UniqueName: \"kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919860 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrjn\" (UniqueName: \"kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " 
pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919887 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919953 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.919976 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.920005 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.920030 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.921322 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.921327 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.922283 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.922554 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.923193 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.924413 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.925792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.946599 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mr2\" (UniqueName: \"kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2\") pod \"controller-manager-7547f884bc-m7tv8\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.948143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrjn\" (UniqueName: \"kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn\") pod \"route-controller-manager-579bf77ccf-66sp7\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:38 crc kubenswrapper[4676]: I1204 15:25:38.990969 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:39 crc kubenswrapper[4676]: I1204 15:25:39.020519 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:39 crc kubenswrapper[4676]: I1204 15:25:39.211426 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:25:39 crc kubenswrapper[4676]: I1204 15:25:39.280790 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] Dec 04 15:25:39 crc kubenswrapper[4676]: W1204 15:25:39.289475 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b19582d_2df2_45bc_9214_366717e5361e.slice/crio-68080945b2b6fba0952aa5b66d911e744609f5b9f650f640c8bb14d4a9dcb744 WatchSource:0}: Error finding container 68080945b2b6fba0952aa5b66d911e744609f5b9f650f640c8bb14d4a9dcb744: Status 404 returned error can't find the container with id 68080945b2b6fba0952aa5b66d911e744609f5b9f650f640c8bb14d4a9dcb744 Dec 04 15:25:39 crc kubenswrapper[4676]: I1204 15:25:39.391561 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591b399c-21b2-4c6f-ab3a-d424df670c0b" path="/var/lib/kubelet/pods/591b399c-21b2-4c6f-ab3a-d424df670c0b/volumes" Dec 04 15:25:39 crc kubenswrapper[4676]: I1204 15:25:39.392518 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a735889f-51fc-49e1-8756-4f9dc2c05d94" path="/var/lib/kubelet/pods/a735889f-51fc-49e1-8756-4f9dc2c05d94/volumes" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.154184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" event={"ID":"4b19582d-2df2-45bc-9214-366717e5361e","Type":"ContainerStarted","Data":"47209d3254d56bfc2ac85cfd195a7c87490bf51f05b4d9b5f746989b3a3ee5aa"} Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.154247 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" event={"ID":"4b19582d-2df2-45bc-9214-366717e5361e","Type":"ContainerStarted","Data":"68080945b2b6fba0952aa5b66d911e744609f5b9f650f640c8bb14d4a9dcb744"} Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.154580 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.156017 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" event={"ID":"c7cf4a76-500b-451e-89c5-d80def11bbbd","Type":"ContainerStarted","Data":"c92e8eb931397db570b30aa3b106e71e30d2be45796cdafa5224ee4ca9eb59bb"} Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.156090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" event={"ID":"c7cf4a76-500b-451e-89c5-d80def11bbbd","Type":"ContainerStarted","Data":"c286dc1c0637a7994dbd7111a76c2c3289b53b997d4991b78d85efb6a3e16b34"} Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.156279 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.165174 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.167071 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.185577 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" podStartSLOduration=2.185552518 podStartE2EDuration="2.185552518s" podCreationTimestamp="2025-12-04 15:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:25:40.176448424 +0000 UTC m=+347.611118281" watchObservedRunningTime="2025-12-04 15:25:40.185552518 +0000 UTC m=+347.620222375" Dec 04 15:25:40 crc kubenswrapper[4676]: I1204 15:25:40.215580 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" podStartSLOduration=2.215553857 podStartE2EDuration="2.215553857s" podCreationTimestamp="2025-12-04 15:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:25:40.212764477 +0000 UTC m=+347.647434344" watchObservedRunningTime="2025-12-04 15:25:40.215553857 +0000 UTC m=+347.650223714" Dec 04 15:25:46 crc kubenswrapper[4676]: I1204 15:25:46.026740 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:25:46 crc kubenswrapper[4676]: I1204 15:25:46.027407 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.205772 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b6wkg"] Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.207682 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.210397 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.217136 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6wkg"] Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.273987 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-utilities\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.274039 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-catalog-content\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.274076 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd2lf\" (UniqueName: \"kubernetes.io/projected/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-kube-api-access-gd2lf\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.375402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-utilities\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.375455 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-catalog-content\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.375481 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd2lf\" (UniqueName: \"kubernetes.io/projected/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-kube-api-access-gd2lf\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.376330 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-utilities\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.376367 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-catalog-content\") pod \"redhat-marketplace-b6wkg\" (UID: 
\"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.390970 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mb2t7"] Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.392232 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.394050 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.404185 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mb2t7"] Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.405192 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd2lf\" (UniqueName: \"kubernetes.io/projected/b7ad7e78-7f85-4c56-8aa9-12aeef76c043-kube-api-access-gd2lf\") pod \"redhat-marketplace-b6wkg\" (UID: \"b7ad7e78-7f85-4c56-8aa9-12aeef76c043\") " pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.476381 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-utilities\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.476765 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-catalog-content\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.476973 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2s76\" (UniqueName: \"kubernetes.io/projected/e655c075-09f9-4409-a370-0acced242279-kube-api-access-k2s76\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.527319 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.579017 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2s76\" (UniqueName: \"kubernetes.io/projected/e655c075-09f9-4409-a370-0acced242279-kube-api-access-k2s76\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.579099 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-utilities\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.579144 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-catalog-content\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.580574 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-catalog-content\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:58 crc kubenswrapper[4676]: I1204 15:25:58.821003 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e655c075-09f9-4409-a370-0acced242279-utilities\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:59 crc kubenswrapper[4676]: I1204 15:25:59.114682 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2s76\" (UniqueName: \"kubernetes.io/projected/e655c075-09f9-4409-a370-0acced242279-kube-api-access-k2s76\") pod \"community-operators-mb2t7\" (UID: \"e655c075-09f9-4409-a370-0acced242279\") " pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:59 crc kubenswrapper[4676]: I1204 15:25:59.326877 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:25:59 crc kubenswrapper[4676]: I1204 15:25:59.463702 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6wkg"] Dec 04 15:25:59 crc kubenswrapper[4676]: I1204 15:25:59.737611 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mb2t7"] Dec 04 15:25:59 crc kubenswrapper[4676]: W1204 15:25:59.751733 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode655c075_09f9_4409_a370_0acced242279.slice/crio-046d9f2f976740b3e04d2bf529c4e496c57f14d15779bd912574588d1e2747ba WatchSource:0}: Error finding container 046d9f2f976740b3e04d2bf529c4e496c57f14d15779bd912574588d1e2747ba: Status 404 returned error can't find the container with id 046d9f2f976740b3e04d2bf529c4e496c57f14d15779bd912574588d1e2747ba Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.454277 4676 generic.go:334] "Generic (PLEG): container finished" podID="b7ad7e78-7f85-4c56-8aa9-12aeef76c043" containerID="bab96be86bb27cf30cf0d2890d510bb23ce366e3c071280bff4a80ca8d803fab" exitCode=0 Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.454378 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6wkg" event={"ID":"b7ad7e78-7f85-4c56-8aa9-12aeef76c043","Type":"ContainerDied","Data":"bab96be86bb27cf30cf0d2890d510bb23ce366e3c071280bff4a80ca8d803fab"} Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.454701 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6wkg" event={"ID":"b7ad7e78-7f85-4c56-8aa9-12aeef76c043","Type":"ContainerStarted","Data":"0151f0129a5c3d976e9de7d7d9744f23081e8157e84328e7b920f33cf0d99772"} Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.456176 4676 generic.go:334] "Generic (PLEG): container finished" podID="e655c075-09f9-4409-a370-0acced242279" containerID="abd32dd70639421200e9f52950827ab1d8fc1b60354b4caa73a8fb3b5f8b79b9" exitCode=0 Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.456224 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb2t7" event={"ID":"e655c075-09f9-4409-a370-0acced242279","Type":"ContainerDied","Data":"abd32dd70639421200e9f52950827ab1d8fc1b60354b4caa73a8fb3b5f8b79b9"} Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.456261 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb2t7" event={"ID":"e655c075-09f9-4409-a370-0acced242279","Type":"ContainerStarted","Data":"046d9f2f976740b3e04d2bf529c4e496c57f14d15779bd912574588d1e2747ba"} Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.596711 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"] Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.598159 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.599943 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.611052 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"] Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.750410 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.750488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.750520 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k5l\" (UniqueName: \"kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.794049 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.795220 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.797445 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.808663 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.851881 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.852159 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.852225 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2k5l\" (UniqueName: \"kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.852821 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.852862 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.872575 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2k5l\" (UniqueName: \"kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l\") pod \"redhat-operators-8dk8v\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") " pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.913171 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.953253 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wq2v\" (UniqueName: \"kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.953566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:00 crc kubenswrapper[4676]: I1204 15:26:00.953726 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.092938 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.093130 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wq2v\" (UniqueName: \"kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.093507 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.093661 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.093996 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content\") pod \"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.125738 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wq2v\" (UniqueName: \"kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v\") pod 
\"certified-operators-gdnsz\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.492568 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.513492 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6wkg" event={"ID":"b7ad7e78-7f85-4c56-8aa9-12aeef76c043","Type":"ContainerStarted","Data":"a50d9cb945d30166fcc072299b6131d50ca72d0807842df418538f6ef3ce7460"} Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.543473 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"] Dec 04 15:26:01 crc kubenswrapper[4676]: I1204 15:26:01.942220 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.527622 4676 generic.go:334] "Generic (PLEG): container finished" podID="aebba73c-4263-4e22-a922-de02e092f260" containerID="8c5205e31092b924aa4e81ff1395c807de46a3bf4622b47fbba7a1627e466418" exitCode=0 Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.527720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerDied","Data":"8c5205e31092b924aa4e81ff1395c807de46a3bf4622b47fbba7a1627e466418"} Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.527756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerStarted","Data":"f928b2470c100f0520746383020a9dcf2e8bbee65b417990b8664841ac08d6a7"} Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.529856 4676 generic.go:334] "Generic (PLEG): container finished" podID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerID="30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668" exitCode=0 Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.529970 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerDied","Data":"30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668"} Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.530017 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerStarted","Data":"a4a1de6848a5a846ffd7defb17d2717fb95d501f2496521b5cba44bcf64224a8"} Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.533423 4676 generic.go:334] "Generic (PLEG): container finished" podID="b7ad7e78-7f85-4c56-8aa9-12aeef76c043" containerID="a50d9cb945d30166fcc072299b6131d50ca72d0807842df418538f6ef3ce7460" exitCode=0 Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.533498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6wkg" event={"ID":"b7ad7e78-7f85-4c56-8aa9-12aeef76c043","Type":"ContainerDied","Data":"a50d9cb945d30166fcc072299b6131d50ca72d0807842df418538f6ef3ce7460"} Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.535468 4676 generic.go:334] "Generic (PLEG): container finished" podID="e655c075-09f9-4409-a370-0acced242279" 
containerID="75332f016c486ce493b7ce7d5f5854aff997f49b0ec71126aa0c146324f639cb" exitCode=0 Dec 04 15:26:02 crc kubenswrapper[4676]: I1204 15:26:02.535504 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb2t7" event={"ID":"e655c075-09f9-4409-a370-0acced242279","Type":"ContainerDied","Data":"75332f016c486ce493b7ce7d5f5854aff997f49b0ec71126aa0c146324f639cb"} Dec 04 15:26:03 crc kubenswrapper[4676]: I1204 15:26:03.606744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6wkg" event={"ID":"b7ad7e78-7f85-4c56-8aa9-12aeef76c043","Type":"ContainerStarted","Data":"697a59e98d2fb7025d156ca2ec675b294824c55f08c81d7b30d8f400d9abe8dc"} Dec 04 15:26:03 crc kubenswrapper[4676]: I1204 15:26:03.617660 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mb2t7" event={"ID":"e655c075-09f9-4409-a370-0acced242279","Type":"ContainerStarted","Data":"5525564fcc20ecf3f02980c8aec832358082ca112bfa02dbb086e05f8c9301d2"} Dec 04 15:26:03 crc kubenswrapper[4676]: I1204 15:26:03.622820 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerStarted","Data":"431da6b8d0b69f4cc44f223523399a8024f11c4b3bcaae7e6d66304e181ca45f"} Dec 04 15:26:03 crc kubenswrapper[4676]: I1204 15:26:03.640395 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b6wkg" podStartSLOduration=3.13505627 podStartE2EDuration="5.640362192s" podCreationTimestamp="2025-12-04 15:25:58 +0000 UTC" firstStartedPulling="2025-12-04 15:26:00.45704997 +0000 UTC m=+367.891719827" lastFinishedPulling="2025-12-04 15:26:02.962355852 +0000 UTC m=+370.397025749" observedRunningTime="2025-12-04 15:26:03.635183585 +0000 UTC m=+371.069853452" watchObservedRunningTime="2025-12-04 15:26:03.640362192 +0000 UTC m=+371.075032059" Dec 04 15:26:04 crc kubenswrapper[4676]: I1204 15:26:04.630281 4676 generic.go:334] "Generic (PLEG): container finished" podID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerID="8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190" exitCode=0 Dec 04 15:26:04 crc kubenswrapper[4676]: I1204 15:26:04.630346 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerDied","Data":"8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190"} Dec 04 15:26:04 crc kubenswrapper[4676]: I1204 15:26:04.633470 4676 generic.go:334] "Generic (PLEG): container finished" podID="aebba73c-4263-4e22-a922-de02e092f260" containerID="431da6b8d0b69f4cc44f223523399a8024f11c4b3bcaae7e6d66304e181ca45f" exitCode=0 Dec 04 15:26:04 crc kubenswrapper[4676]: I1204 15:26:04.633630 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerDied","Data":"431da6b8d0b69f4cc44f223523399a8024f11c4b3bcaae7e6d66304e181ca45f"} Dec 04 15:26:04 crc kubenswrapper[4676]: I1204 15:26:04.658968 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mb2t7" podStartSLOduration=4.185453644 podStartE2EDuration="6.658944489s" podCreationTimestamp="2025-12-04 15:25:58 +0000 UTC" firstStartedPulling="2025-12-04 15:26:00.458681752 +0000 UTC m=+367.893351609" 
lastFinishedPulling="2025-12-04 15:26:02.932172597 +0000 UTC m=+370.366842454" observedRunningTime="2025-12-04 15:26:03.68672037 +0000 UTC m=+371.121390227" watchObservedRunningTime="2025-12-04 15:26:04.658944489 +0000 UTC m=+372.093614346" Dec 04 15:26:05 crc kubenswrapper[4676]: I1204 15:26:05.656396 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerStarted","Data":"416ca805cb14fb557246da2a611333b84b335e440cd1a780d6e3d0633893b54e"} Dec 04 15:26:05 crc kubenswrapper[4676]: I1204 15:26:05.659391 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerStarted","Data":"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"} Dec 04 15:26:05 crc kubenswrapper[4676]: I1204 15:26:05.698063 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8dk8v" podStartSLOduration=3.191371074 podStartE2EDuration="5.698040689s" podCreationTimestamp="2025-12-04 15:26:00 +0000 UTC" firstStartedPulling="2025-12-04 15:26:02.535678864 +0000 UTC m=+369.970348721" lastFinishedPulling="2025-12-04 15:26:05.042348479 +0000 UTC m=+372.477018336" observedRunningTime="2025-12-04 15:26:05.697839303 +0000 UTC m=+373.132509170" watchObservedRunningTime="2025-12-04 15:26:05.698040689 +0000 UTC m=+373.132710546" Dec 04 15:26:05 crc kubenswrapper[4676]: I1204 15:26:05.699547 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gdnsz" podStartSLOduration=3.090653928 podStartE2EDuration="5.699539087s" podCreationTimestamp="2025-12-04 15:26:00 +0000 UTC" firstStartedPulling="2025-12-04 15:26:02.529488014 +0000 UTC m=+369.964157871" lastFinishedPulling="2025-12-04 15:26:05.138373173 +0000 UTC m=+372.573043030" observedRunningTime="2025-12-04 15:26:05.679865362 +0000 UTC m=+373.114535219" watchObservedRunningTime="2025-12-04 15:26:05.699539087 +0000 UTC m=+373.134208944" Dec 04 15:26:08 crc kubenswrapper[4676]: I1204 15:26:08.527578 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:26:08 crc kubenswrapper[4676]: I1204 15:26:08.527632 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:26:08 crc kubenswrapper[4676]: I1204 15:26:08.571757 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:26:08 crc kubenswrapper[4676]: I1204 15:26:08.724587 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b6wkg" Dec 04 15:26:09 crc kubenswrapper[4676]: I1204 15:26:09.327459 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:26:09 crc kubenswrapper[4676]: I1204 15:26:09.327641 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:26:09 crc kubenswrapper[4676]: I1204 15:26:09.367771 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:26:10 crc kubenswrapper[4676]: I1204 15:26:10.022569 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mb2t7" Dec 04 15:26:10 crc kubenswrapper[4676]: I1204 15:26:10.913535 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:10 crc kubenswrapper[4676]: I1204 15:26:10.913612 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:11 crc kubenswrapper[4676]: I1204 15:26:11.493967 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:11 crc kubenswrapper[4676]: I1204 15:26:11.494269 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:11 crc kubenswrapper[4676]: I1204 15:26:11.535327 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:11 crc kubenswrapper[4676]: I1204 15:26:11.955250 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8dk8v" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="registry-server" probeResult="failure" output=< Dec 04 15:26:11 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 15:26:11 crc kubenswrapper[4676]: > Dec 04 15:26:12 crc kubenswrapper[4676]: I1204 15:26:12.017575 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.027377 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.027768 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.138285 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tpct2"] Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.139588 4676 util.go:30] "No sandbox for pod can be found. 
Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.161049 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tpct2"] Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.334739 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-trusted-ca\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.334800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-registry-certificates\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335060 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53655427-30ac-4c29-a94b-2480b20f9697-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335202 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-registry-tls\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335268 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53655427-30ac-4c29-a94b-2480b20f9697-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335337 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-bound-sa-token\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335463 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr7kf\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-kube-api-access-nr7kf\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.335550 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.371428 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438555 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53655427-30ac-4c29-a94b-2480b20f9697-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-bound-sa-token\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438668 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nr7kf\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-kube-api-access-nr7kf\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-trusted-ca\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438727 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-registry-certificates\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438762 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53655427-30ac-4c29-a94b-2480b20f9697-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.438799 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-registry-tls\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.439768 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53655427-30ac-4c29-a94b-2480b20f9697-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.440124 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-trusted-ca\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.440521 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53655427-30ac-4c29-a94b-2480b20f9697-registry-certificates\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.444258 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53655427-30ac-4c29-a94b-2480b20f9697-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.444470 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-registry-tls\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.457784 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-bound-sa-token\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.458023 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr7kf\" (UniqueName: \"kubernetes.io/projected/53655427-30ac-4c29-a94b-2480b20f9697-kube-api-access-nr7kf\") pod \"image-registry-66df7c8f76-tpct2\" (UID: \"53655427-30ac-4c29-a94b-2480b20f9697\") " pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:16 crc kubenswrapper[4676]: I1204 15:26:16.758761 4676 util.go:30] "No sandbox for pod can be found. 
Dec 04 15:26:17 crc kubenswrapper[4676]: I1204 15:26:17.185626 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tpct2"] Dec 04 15:26:17 crc kubenswrapper[4676]: W1204 15:26:17.189560 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53655427_30ac_4c29_a94b_2480b20f9697.slice/crio-0da616be72d9f7060a99ad4f3a551fc08c9de21a3b55b322932c08b6e5495c41 WatchSource:0}: Error finding container 0da616be72d9f7060a99ad4f3a551fc08c9de21a3b55b322932c08b6e5495c41: Status 404 returned error can't find the container with id 0da616be72d9f7060a99ad4f3a551fc08c9de21a3b55b322932c08b6e5495c41 Dec 04 15:26:17 crc kubenswrapper[4676]: I1204 15:26:17.332337 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:26:17 crc kubenswrapper[4676]: I1204 15:26:17.332957 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerName="controller-manager" containerID="cri-o://c92e8eb931397db570b30aa3b106e71e30d2be45796cdafa5224ee4ca9eb59bb" gracePeriod=30 Dec 04 15:26:18 crc kubenswrapper[4676]: I1204 15:26:18.047304 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" event={"ID":"53655427-30ac-4c29-a94b-2480b20f9697","Type":"ContainerStarted","Data":"0da616be72d9f7060a99ad4f3a551fc08c9de21a3b55b322932c08b6e5495c41"} Dec 04 15:26:18 crc kubenswrapper[4676]: I1204 15:26:18.991745 4676 patch_prober.go:28] interesting pod/controller-manager-7547f884bc-m7tv8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 04 15:26:18 crc kubenswrapper[4676]: I1204 15:26:18.992085 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 04 15:26:21 crc kubenswrapper[4676]: I1204 15:26:21.020083 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:21 crc kubenswrapper[4676]: I1204 15:26:21.062036 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8dk8v" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.165472 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.202400 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-84c5c66664-d6nsn"] Dec 04 15:26:25 crc kubenswrapper[4676]: E1204 15:26:25.203271 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerName="controller-manager" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.203302 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerName="controller-manager" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.203510 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerName="controller-manager" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.204010 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.217764 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84c5c66664-d6nsn"] Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.268623 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22mr2\" (UniqueName: \"kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2\") pod \"c7cf4a76-500b-451e-89c5-d80def11bbbd\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.268767 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles\") pod \"c7cf4a76-500b-451e-89c5-d80def11bbbd\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.269694 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c7cf4a76-500b-451e-89c5-d80def11bbbd" (UID: "c7cf4a76-500b-451e-89c5-d80def11bbbd"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.269802 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert\") pod \"c7cf4a76-500b-451e-89c5-d80def11bbbd\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.270370 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca\") pod \"c7cf4a76-500b-451e-89c5-d80def11bbbd\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.270450 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config\") pod \"c7cf4a76-500b-451e-89c5-d80def11bbbd\" (UID: \"c7cf4a76-500b-451e-89c5-d80def11bbbd\") " Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.270739 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca" (OuterVolumeSpecName: "client-ca") pod "c7cf4a76-500b-451e-89c5-d80def11bbbd" (UID: "c7cf4a76-500b-451e-89c5-d80def11bbbd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271096 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-config\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271143 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-client-ca\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271224 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-644hg\" (UniqueName: \"kubernetes.io/projected/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-kube-api-access-644hg\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271250 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config" (OuterVolumeSpecName: "config") pod "c7cf4a76-500b-451e-89c5-d80def11bbbd" (UID: "c7cf4a76-500b-451e-89c5-d80def11bbbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271286 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-proxy-ca-bundles\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271438 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-serving-cert\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271550 4676 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271598 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.271612 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7cf4a76-500b-451e-89c5-d80def11bbbd-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.276060 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c7cf4a76-500b-451e-89c5-d80def11bbbd" (UID: "c7cf4a76-500b-451e-89c5-d80def11bbbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.277171 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2" (OuterVolumeSpecName: "kube-api-access-22mr2") pod "c7cf4a76-500b-451e-89c5-d80def11bbbd" (UID: "c7cf4a76-500b-451e-89c5-d80def11bbbd"). InnerVolumeSpecName "kube-api-access-22mr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372749 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-config\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372800 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-client-ca\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372842 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-644hg\" (UniqueName: \"kubernetes.io/projected/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-kube-api-access-644hg\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372875 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-proxy-ca-bundles\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-serving-cert\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.372984 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22mr2\" (UniqueName: \"kubernetes.io/projected/c7cf4a76-500b-451e-89c5-d80def11bbbd-kube-api-access-22mr2\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.373002 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7cf4a76-500b-451e-89c5-d80def11bbbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.374010 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-client-ca\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.374515 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-config\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 
15:26:25.375276 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-proxy-ca-bundles\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.376566 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-serving-cert\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.392633 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-644hg\" (UniqueName: \"kubernetes.io/projected/5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d-kube-api-access-644hg\") pod \"controller-manager-84c5c66664-d6nsn\" (UID: \"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d\") " pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.520303 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.728144 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-84c5c66664-d6nsn"] Dec 04 15:26:25 crc kubenswrapper[4676]: W1204 15:26:25.740984 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de8b3f3_b8ae_4fc5_b224_7e0d640bb78d.slice/crio-e0a47816966ff2a56f0879c048dd93024f1ffd6c028a4c23b23957cc06ab5e9d WatchSource:0}: Error finding container e0a47816966ff2a56f0879c048dd93024f1ffd6c028a4c23b23957cc06ab5e9d: Status 404 returned error can't find the container with id e0a47816966ff2a56f0879c048dd93024f1ffd6c028a4c23b23957cc06ab5e9d Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.802647 4676 generic.go:334] "Generic (PLEG): container finished" podID="c7cf4a76-500b-451e-89c5-d80def11bbbd" containerID="c92e8eb931397db570b30aa3b106e71e30d2be45796cdafa5224ee4ca9eb59bb" exitCode=0 Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.802706 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" event={"ID":"c7cf4a76-500b-451e-89c5-d80def11bbbd","Type":"ContainerDied","Data":"c92e8eb931397db570b30aa3b106e71e30d2be45796cdafa5224ee4ca9eb59bb"} Dec 04 15:26:25 crc kubenswrapper[4676]: I1204 15:26:25.802762 4676 scope.go:117] "RemoveContainer" containerID="c92e8eb931397db570b30aa3b106e71e30d2be45796cdafa5224ee4ca9eb59bb" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.811032 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" event={"ID":"53655427-30ac-4c29-a94b-2480b20f9697","Type":"ContainerStarted","Data":"cad11d884a0cca1d294c6719d9b3acafc57c16f4ecb7238425d54e0751714cdb"} Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.811231 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.812302 4676 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" event={"ID":"c7cf4a76-500b-451e-89c5-d80def11bbbd","Type":"ContainerDied","Data":"c286dc1c0637a7994dbd7111a76c2c3289b53b997d4991b78d85efb6a3e16b34"} Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.812366 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7547f884bc-m7tv8" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.814290 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" event={"ID":"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d","Type":"ContainerStarted","Data":"9a3ac466bb59791dec35888a905359d313f655a93a86699c169ded902ca7f0bb"} Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.814346 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" event={"ID":"5de8b3f3-b8ae-4fc5-b224-7e0d640bb78d","Type":"ContainerStarted","Data":"e0a47816966ff2a56f0879c048dd93024f1ffd6c028a4c23b23957cc06ab5e9d"} Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.814557 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.823884 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.837891 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" podStartSLOduration=10.837844253 podStartE2EDuration="10.837844253s" podCreationTimestamp="2025-12-04 15:26:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:26:26.834276554 +0000 UTC m=+394.268946421" watchObservedRunningTime="2025-12-04 15:26:26.837844253 +0000 UTC m=+394.272514130" Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.852484 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.855755 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7547f884bc-m7tv8"] Dec 04 15:26:26 crc kubenswrapper[4676]: I1204 15:26:26.870685 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-84c5c66664-d6nsn" podStartSLOduration=9.870662935 podStartE2EDuration="9.870662935s" podCreationTimestamp="2025-12-04 15:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:26:26.868103697 +0000 UTC m=+394.302773554" watchObservedRunningTime="2025-12-04 15:26:26.870662935 +0000 UTC m=+394.305332792" Dec 04 15:26:27 crc kubenswrapper[4676]: I1204 15:26:27.391796 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7cf4a76-500b-451e-89c5-d80def11bbbd" path="/var/lib/kubelet/pods/c7cf4a76-500b-451e-89c5-d80def11bbbd/volumes" Dec 04 15:26:36 crc kubenswrapper[4676]: I1204 15:26:36.872567 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] 
Dec 04 15:26:36 crc kubenswrapper[4676]: I1204 15:26:36.873951 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" podUID="4b19582d-2df2-45bc-9214-366717e5361e" containerName="route-controller-manager" containerID="cri-o://47209d3254d56bfc2ac85cfd195a7c87490bf51f05b4d9b5f746989b3a3ee5aa" gracePeriod=30 Dec 04 15:26:37 crc kubenswrapper[4676]: I1204 15:26:37.874031 4676 generic.go:334] "Generic (PLEG): container finished" podID="4b19582d-2df2-45bc-9214-366717e5361e" containerID="47209d3254d56bfc2ac85cfd195a7c87490bf51f05b4d9b5f746989b3a3ee5aa" exitCode=0 Dec 04 15:26:37 crc kubenswrapper[4676]: I1204 15:26:37.874411 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" event={"ID":"4b19582d-2df2-45bc-9214-366717e5361e","Type":"ContainerDied","Data":"47209d3254d56bfc2ac85cfd195a7c87490bf51f05b4d9b5f746989b3a3ee5aa"} Dec 04 15:26:37 crc kubenswrapper[4676]: I1204 15:26:37.934689 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.116592 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert\") pod \"4b19582d-2df2-45bc-9214-366717e5361e\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.116726 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzrjn\" (UniqueName: \"kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn\") pod \"4b19582d-2df2-45bc-9214-366717e5361e\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.116808 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca\") pod \"4b19582d-2df2-45bc-9214-366717e5361e\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.116836 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config\") pod \"4b19582d-2df2-45bc-9214-366717e5361e\" (UID: \"4b19582d-2df2-45bc-9214-366717e5361e\") " Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.117866 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config" (OuterVolumeSpecName: "config") pod "4b19582d-2df2-45bc-9214-366717e5361e" (UID: "4b19582d-2df2-45bc-9214-366717e5361e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.117874 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca" (OuterVolumeSpecName: "client-ca") pod "4b19582d-2df2-45bc-9214-366717e5361e" (UID: "4b19582d-2df2-45bc-9214-366717e5361e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.122559 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4b19582d-2df2-45bc-9214-366717e5361e" (UID: "4b19582d-2df2-45bc-9214-366717e5361e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.123158 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn" (OuterVolumeSpecName: "kube-api-access-lzrjn") pod "4b19582d-2df2-45bc-9214-366717e5361e" (UID: "4b19582d-2df2-45bc-9214-366717e5361e"). InnerVolumeSpecName "kube-api-access-lzrjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.218181 4676 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.218229 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b19582d-2df2-45bc-9214-366717e5361e-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.218239 4676 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b19582d-2df2-45bc-9214-366717e5361e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.218250 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzrjn\" (UniqueName: \"kubernetes.io/projected/4b19582d-2df2-45bc-9214-366717e5361e-kube-api-access-lzrjn\") on node \"crc\" DevicePath \"\"" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.229412 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96"] Dec 04 15:26:38 crc kubenswrapper[4676]: E1204 15:26:38.230023 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b19582d-2df2-45bc-9214-366717e5361e" containerName="route-controller-manager" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.230136 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b19582d-2df2-45bc-9214-366717e5361e" containerName="route-controller-manager" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.230444 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b19582d-2df2-45bc-9214-366717e5361e" containerName="route-controller-manager" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.231153 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.246618 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96"] Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.420291 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g92q4\" (UniqueName: \"kubernetes.io/projected/bb65e703-c38b-4a12-afe1-4326dac77b72-kube-api-access-g92q4\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.420364 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-client-ca\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.420393 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-config\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.420421 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb65e703-c38b-4a12-afe1-4326dac77b72-serving-cert\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.522134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-client-ca\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.522191 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-config\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.522225 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb65e703-c38b-4a12-afe1-4326dac77b72-serving-cert\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.522361 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-g92q4\" (UniqueName: \"kubernetes.io/projected/bb65e703-c38b-4a12-afe1-4326dac77b72-kube-api-access-g92q4\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.523355 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-client-ca\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.524815 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb65e703-c38b-4a12-afe1-4326dac77b72-config\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.526973 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb65e703-c38b-4a12-afe1-4326dac77b72-serving-cert\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.547514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g92q4\" (UniqueName: \"kubernetes.io/projected/bb65e703-c38b-4a12-afe1-4326dac77b72-kube-api-access-g92q4\") pod \"route-controller-manager-76cd887b8-m5s96\" (UID: \"bb65e703-c38b-4a12-afe1-4326dac77b72\") " pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.549373 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.883185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" event={"ID":"4b19582d-2df2-45bc-9214-366717e5361e","Type":"ContainerDied","Data":"68080945b2b6fba0952aa5b66d911e744609f5b9f650f640c8bb14d4a9dcb744"} Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.883658 4676 scope.go:117] "RemoveContainer" containerID="47209d3254d56bfc2ac85cfd195a7c87490bf51f05b4d9b5f746989b3a3ee5aa" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.883576 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7" Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.927126 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.931312 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-579bf77ccf-66sp7"] Dec 04 15:26:38 crc kubenswrapper[4676]: I1204 15:26:38.994189 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96"] Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.393516 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b19582d-2df2-45bc-9214-366717e5361e" path="/var/lib/kubelet/pods/4b19582d-2df2-45bc-9214-366717e5361e/volumes" Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.891715 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" event={"ID":"bb65e703-c38b-4a12-afe1-4326dac77b72","Type":"ContainerStarted","Data":"563556855eba4e44f6c5656c4a30f7d116b6b9c0316ed5abe05ee349a0cb729d"} Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.891865 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.891885 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" event={"ID":"bb65e703-c38b-4a12-afe1-4326dac77b72","Type":"ContainerStarted","Data":"92977b68d1b7b7c8ee58aa6a7127a0f7e83290c5440bc464128b29f6e08d6742"} Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.897277 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" Dec 04 15:26:39 crc kubenswrapper[4676]: I1204 15:26:39.918887 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76cd887b8-m5s96" podStartSLOduration=2.9188543129999998 podStartE2EDuration="2.918854313s" podCreationTimestamp="2025-12-04 15:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:26:39.914406507 +0000 UTC m=+407.349076384" watchObservedRunningTime="2025-12-04 15:26:39.918854313 +0000 UTC m=+407.353524170" Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.027031 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.027618 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.028255 4676 kubelet.go:2542] "SyncLoop 
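(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"

This liveness failure, unlike the readiness failures earlier, is actionable: the next entries show the kubelet marking machine-config-daemon for restart and killing it with gracePeriod=600. The probe itself is an HTTP GET against http://127.0.0.1:8798/health that gets connection refused because nothing is listening. A hedged sketch of an equivalent client-side check (not the kubelet's prober; the error matching assumes a Unix-style errno in the wrapped chain):

```go
package main

import (
	"errors"
	"fmt"
	"net/http"
	"syscall"
	"time"
)

func main() {
	client := &http.Client{Timeout: time.Second}

	// Same endpoint the kubelet probes for machine-config-daemon.
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		if errors.Is(err, syscall.ECONNREFUSED) {
			// Nothing is listening: the exact failure in the log, which
			// makes the kubelet restart the container.
			fmt.Println("probe failed: connection refused")
			return
		}
		fmt.Println("probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.StatusCode) // any 2xx counts as healthy
}
```
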
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.028983 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.029056 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2" gracePeriod=600 Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.764106 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tpct2" Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.833801 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"] Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.940401 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2" exitCode=0 Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.940623 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2"} Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.940833 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed"} Dec 04 15:26:46 crc kubenswrapper[4676]: I1204 15:26:46.940891 4676 scope.go:117] "RemoveContainer" containerID="d62af8f96fa95afdc04bddc5815a67eed1856bc5780355f561c79174291831f8" Dec 04 15:27:11 crc kubenswrapper[4676]: I1204 15:27:11.878650 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerName="registry" containerID="cri-o://11af11341549c246de2b757fb4020d95509901941c758904ef8024dbf003f204" gracePeriod=30 Dec 04 15:27:11 crc kubenswrapper[4676]: I1204 15:27:11.951278 4676 patch_prober.go:28] interesting pod/image-registry-697d97f7c8-lfwj6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.31:5000/healthz\": dial tcp 10.217.0.31:5000: connect: connection refused" start-of-body= Dec 04 15:27:11 crc kubenswrapper[4676]: I1204 15:27:11.951397 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.31:5000/healthz\": dial tcp 10.217.0.31:5000: connect: connection refused" Dec 04 15:27:12 crc 
kubenswrapper[4676]: I1204 15:27:12.084831 4676 generic.go:334] "Generic (PLEG): container finished" podID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerID="11af11341549c246de2b757fb4020d95509901941c758904ef8024dbf003f204" exitCode=0 Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.084984 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" event={"ID":"8742ff93-db20-4d4e-84fa-a9c4276643ea","Type":"ContainerDied","Data":"11af11341549c246de2b757fb4020d95509901941c758904ef8024dbf003f204"} Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.265716 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457550 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457622 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457658 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457681 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmdm\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457713 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457759 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457776 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.457951 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8742ff93-db20-4d4e-84fa-a9c4276643ea\" (UID: \"8742ff93-db20-4d4e-84fa-a9c4276643ea\") " Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.460568 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.460597 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.463864 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm" (OuterVolumeSpecName: "kube-api-access-5nmdm") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "kube-api-access-5nmdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.464309 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.464405 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.470404 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.470591 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.478543 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8742ff93-db20-4d4e-84fa-a9c4276643ea" (UID: "8742ff93-db20-4d4e-84fa-a9c4276643ea"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559842 4676 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8742ff93-db20-4d4e-84fa-a9c4276643ea-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559885 4676 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559899 4676 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559937 4676 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559945 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8742ff93-db20-4d4e-84fa-a9c4276643ea-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559953 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nmdm\" (UniqueName: \"kubernetes.io/projected/8742ff93-db20-4d4e-84fa-a9c4276643ea-kube-api-access-5nmdm\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:12 crc kubenswrapper[4676]: I1204 15:27:12.559963 4676 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8742ff93-db20-4d4e-84fa-a9c4276643ea-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.093053 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6" event={"ID":"8742ff93-db20-4d4e-84fa-a9c4276643ea","Type":"ContainerDied","Data":"967ffe1e29c10f2822cfc6aee466e5c675a067371a0ce40b0329affd7ed8bbca"} Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.093165 4676 scope.go:117] "RemoveContainer" containerID="11af11341549c246de2b757fb4020d95509901941c758904ef8024dbf003f204" Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.093972 4676 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.093972 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lfwj6"
Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.130093 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"]
Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.132497 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lfwj6"]
Dec 04 15:27:13 crc kubenswrapper[4676]: I1204 15:27:13.392225 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" path="/var/lib/kubelet/pods/8742ff93-db20-4d4e-84fa-a9c4276643ea/volumes"
Dec 04 15:28:46 crc kubenswrapper[4676]: I1204 15:28:46.026779 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:28:46 crc kubenswrapper[4676]: I1204 15:28:46.027350 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:29:16 crc kubenswrapper[4676]: I1204 15:29:16.026796 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:29:16 crc kubenswrapper[4676]: I1204 15:29:16.027393 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.026683 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.027395 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.027501 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.028456 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.028551 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed" gracePeriod=600
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.903526 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed" exitCode=0
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.903589 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed"}
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.904099 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9"}
Dec 04 15:29:46 crc kubenswrapper[4676]: I1204 15:29:46.904168 4676 scope.go:117] "RemoveContainer" containerID="6e26c970b5d7b969724e4eca8ef33d05c52608b0c4a173cc79a32e81b4de40c2"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.219746 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"]
Dec 04 15:30:00 crc kubenswrapper[4676]: E1204 15:30:00.221444 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerName="registry"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.221539 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerName="registry"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.221766 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8742ff93-db20-4d4e-84fa-a9c4276643ea" containerName="registry"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.222456 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.227503 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.227502 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.248886 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"]
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.389587 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.389697 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6lk\" (UniqueName: \"kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.389762 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.490416 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.490469 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.490530 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6lk\" (UniqueName: \"kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.491883 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.496511 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.507282 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6lk\" (UniqueName: \"kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk\") pod \"collect-profiles-29414370-pfx7r\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.542055 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:00 crc kubenswrapper[4676]: I1204 15:30:00.765032 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"]
Dec 04 15:30:01 crc kubenswrapper[4676]: I1204 15:30:01.076104 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r" event={"ID":"b70fed03-9c92-403c-9f63-732c2aeb0fd6","Type":"ContainerStarted","Data":"8b22d30475a9fd360280e23a4d36e904846b1d07e6dd241e5c093771aef99b6d"}
Dec 04 15:30:01 crc kubenswrapper[4676]: I1204 15:30:01.076166 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r" event={"ID":"b70fed03-9c92-403c-9f63-732c2aeb0fd6","Type":"ContainerStarted","Data":"125abfde9a98b3d51437498fec61cddfc0d1f2333c9373450147cdac8e777bca"}
Dec 04 15:30:01 crc kubenswrapper[4676]: I1204 15:30:01.100440 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r" podStartSLOduration=1.100205683 podStartE2EDuration="1.100205683s" podCreationTimestamp="2025-12-04 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:30:01.091999097 +0000 UTC m=+608.526668954" watchObservedRunningTime="2025-12-04 15:30:01.100205683 +0000 UTC m=+608.534875540"
Dec 04 15:30:02 crc kubenswrapper[4676]: I1204 15:30:02.084348 4676 generic.go:334] "Generic (PLEG): container finished" podID="b70fed03-9c92-403c-9f63-732c2aeb0fd6" containerID="8b22d30475a9fd360280e23a4d36e904846b1d07e6dd241e5c093771aef99b6d" exitCode=0
Dec 04 15:30:02 crc kubenswrapper[4676]: I1204 15:30:02.084435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r" event={"ID":"b70fed03-9c92-403c-9f63-732c2aeb0fd6","Type":"ContainerDied","Data":"8b22d30475a9fd360280e23a4d36e904846b1d07e6dd241e5c093771aef99b6d"}
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.339742 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.515551 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume\") pod \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") "
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.515621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c6lk\" (UniqueName: \"kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk\") pod \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") "
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.515673 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume\") pod \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\" (UID: \"b70fed03-9c92-403c-9f63-732c2aeb0fd6\") "
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.516435 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume" (OuterVolumeSpecName: "config-volume") pod "b70fed03-9c92-403c-9f63-732c2aeb0fd6" (UID: "b70fed03-9c92-403c-9f63-732c2aeb0fd6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.521746 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk" (OuterVolumeSpecName: "kube-api-access-5c6lk") pod "b70fed03-9c92-403c-9f63-732c2aeb0fd6" (UID: "b70fed03-9c92-403c-9f63-732c2aeb0fd6"). InnerVolumeSpecName "kube-api-access-5c6lk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.530400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b70fed03-9c92-403c-9f63-732c2aeb0fd6" (UID: "b70fed03-9c92-403c-9f63-732c2aeb0fd6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.616737 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c6lk\" (UniqueName: \"kubernetes.io/projected/b70fed03-9c92-403c-9f63-732c2aeb0fd6-kube-api-access-5c6lk\") on node \"crc\" DevicePath \"\""
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.616798 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b70fed03-9c92-403c-9f63-732c2aeb0fd6-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 15:30:03 crc kubenswrapper[4676]: I1204 15:30:03.616819 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b70fed03-9c92-403c-9f63-732c2aeb0fd6-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 15:30:04 crc kubenswrapper[4676]: I1204 15:30:04.121439 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r" event={"ID":"b70fed03-9c92-403c-9f63-732c2aeb0fd6","Type":"ContainerDied","Data":"125abfde9a98b3d51437498fec61cddfc0d1f2333c9373450147cdac8e777bca"}
Dec 04 15:30:04 crc kubenswrapper[4676]: I1204 15:30:04.121495 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="125abfde9a98b3d51437498fec61cddfc0d1f2333c9373450147cdac8e777bca"
Dec 04 15:30:04 crc kubenswrapper[4676]: I1204 15:30:04.121554 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.243617 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"]
Dec 04 15:30:30 crc kubenswrapper[4676]: E1204 15:30:30.244517 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b70fed03-9c92-403c-9f63-732c2aeb0fd6" containerName="collect-profiles"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.244537 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b70fed03-9c92-403c-9f63-732c2aeb0fd6" containerName="collect-profiles"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.244691 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b70fed03-9c92-403c-9f63-732c2aeb0fd6" containerName="collect-profiles"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.245223 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.247505 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-jxrm8"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.248032 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.253426 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.255564 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"]
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.268803 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ts58n"]
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.269777 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-ts58n"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.271460 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-58ppm"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.276392 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-stttj"]
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.280962 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.282492 4676 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-ftpnt"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.293425 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-stttj"]
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.321821 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ts58n"]
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.404131 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmb5t\" (UniqueName: \"kubernetes.io/projected/b3a1fea5-f2ce-4047-b055-35cdaadd95c2-kube-api-access-gmb5t\") pod \"cert-manager-cainjector-7f985d654d-kf2h4\" (UID: \"b3a1fea5-f2ce-4047-b055-35cdaadd95c2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.404410 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfstn\" (UniqueName: \"kubernetes.io/projected/9df19d98-0550-4720-bed1-056a83f77d6b-kube-api-access-rfstn\") pod \"cert-manager-5b446d88c5-ts58n\" (UID: \"9df19d98-0550-4720-bed1-056a83f77d6b\") " pod="cert-manager/cert-manager-5b446d88c5-ts58n"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.404530 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xbj\" (UniqueName: \"kubernetes.io/projected/b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4-kube-api-access-47xbj\") pod \"cert-manager-webhook-5655c58dd6-stttj\" (UID: \"b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.505862 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmb5t\" (UniqueName: \"kubernetes.io/projected/b3a1fea5-f2ce-4047-b055-35cdaadd95c2-kube-api-access-gmb5t\") pod \"cert-manager-cainjector-7f985d654d-kf2h4\" (UID: \"b3a1fea5-f2ce-4047-b055-35cdaadd95c2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.505937 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfstn\" (UniqueName: \"kubernetes.io/projected/9df19d98-0550-4720-bed1-056a83f77d6b-kube-api-access-rfstn\") pod \"cert-manager-5b446d88c5-ts58n\" (UID: \"9df19d98-0550-4720-bed1-056a83f77d6b\") " pod="cert-manager/cert-manager-5b446d88c5-ts58n"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.505973 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47xbj\" (UniqueName: \"kubernetes.io/projected/b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4-kube-api-access-47xbj\") pod \"cert-manager-webhook-5655c58dd6-stttj\" (UID: \"b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.528373 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xbj\" (UniqueName: \"kubernetes.io/projected/b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4-kube-api-access-47xbj\") pod \"cert-manager-webhook-5655c58dd6-stttj\" (UID: \"b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.529042 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfstn\" (UniqueName: \"kubernetes.io/projected/9df19d98-0550-4720-bed1-056a83f77d6b-kube-api-access-rfstn\") pod \"cert-manager-5b446d88c5-ts58n\" (UID: \"9df19d98-0550-4720-bed1-056a83f77d6b\") " pod="cert-manager/cert-manager-5b446d88c5-ts58n"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.533736 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmb5t\" (UniqueName: \"kubernetes.io/projected/b3a1fea5-f2ce-4047-b055-35cdaadd95c2-kube-api-access-gmb5t\") pod \"cert-manager-cainjector-7f985d654d-kf2h4\" (UID: \"b3a1fea5-f2ce-4047-b055-35cdaadd95c2\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.568045 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.592427 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-ts58n"
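
Each pod above gets a generated kube-api-access-<suffix> projected volume; the kube-root-ca.crt and openshift-service-ca.crt reflector entries are the kubelet caching the ConfigMaps that feed it. A sketch of the conventional projection (service-account token, CA bundle, namespace file) using the upstream API types; the 3607-second expiry and the exact item layout are assumptions, not read from this log:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    func main() {
        expiry := int64(3607) // assumed kubelet-requested token lifetime
        vol := corev1.Volume{
            Name: "kube-api-access-gmb5t",
            VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{
                    Sources: []corev1.VolumeProjection{
                        {ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
                            Path: "token", ExpirationSeconds: &expiry}},
                        {ConfigMap: &corev1.ConfigMapProjection{
                            LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
                            Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
                        }},
                        {DownwardAPI: &corev1.DownwardAPIProjection{
                            Items: []corev1.DownwardAPIVolumeFile{{Path: "namespace",
                                FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"}}},
                        }},
                    },
                },
            },
        }
        fmt.Println(vol.Name)
    }
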
Dec 04 15:30:30 crc kubenswrapper[4676]: I1204 15:30:30.596315 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.008925 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kf2h4"]
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.021077 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.068376 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-stttj"]
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.080081 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-ts58n"]
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.276639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj" event={"ID":"b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4","Type":"ContainerStarted","Data":"0e26c1d8d7729fe265d93cae8413a7767ac56a6e94d7a3719e2ab23899578bfd"}
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.277959 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-ts58n" event={"ID":"9df19d98-0550-4720-bed1-056a83f77d6b","Type":"ContainerStarted","Data":"9ad924fdf4cad97ccb98407e2d4081363657532ed16ac3df720d72954dc9a118"}
Dec 04 15:30:31 crc kubenswrapper[4676]: I1204 15:30:31.278862 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4" event={"ID":"b3a1fea5-f2ce-4047-b055-35cdaadd95c2","Type":"ContainerStarted","Data":"ef5bbe92072ad9234f94e76df6fa842a3a6998fc005c5ff53894dd97f761a3be"}
Dec 04 15:30:35 crc kubenswrapper[4676]: I1204 15:30:35.305605 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-ts58n" event={"ID":"9df19d98-0550-4720-bed1-056a83f77d6b","Type":"ContainerStarted","Data":"d5c2a34392625b89254dcb47467f451d70945b32eda31b5b829945c46e268f48"}
Dec 04 15:30:35 crc kubenswrapper[4676]: I1204 15:30:35.307667 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4" event={"ID":"b3a1fea5-f2ce-4047-b055-35cdaadd95c2","Type":"ContainerStarted","Data":"5c764f8e7a95833ce06f1b13a145a1608920df310d237361153630047062bfe2"}
Dec 04 15:30:35 crc kubenswrapper[4676]: I1204 15:30:35.332662 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-ts58n" podStartSLOduration=2.034942468 podStartE2EDuration="5.332626987s" podCreationTimestamp="2025-12-04 15:30:30 +0000 UTC" firstStartedPulling="2025-12-04 15:30:31.089125053 +0000 UTC m=+638.523794910" lastFinishedPulling="2025-12-04 15:30:34.386809572 +0000 UTC m=+641.821479429" observedRunningTime="2025-12-04 15:30:35.326057248 +0000 UTC m=+642.760727105" watchObservedRunningTime="2025-12-04 15:30:35.332626987 +0000 UTC m=+642.767296854"
Dec 04 15:30:35 crc kubenswrapper[4676]: I1204 15:30:35.343757 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-kf2h4" podStartSLOduration=1.920201265 podStartE2EDuration="5.343736237s" podCreationTimestamp="2025-12-04 15:30:30 +0000 UTC" firstStartedPulling="2025-12-04 15:30:31.020652592 +0000 UTC m=+638.455322449" lastFinishedPulling="2025-12-04 15:30:34.444187564 +0000 UTC m=+641.878857421" observedRunningTime="2025-12-04 15:30:35.342502302 +0000 UTC m=+642.777172159" watchObservedRunningTime="2025-12-04 15:30:35.343736237 +0000 UTC m=+642.778406094"
Dec 04 15:30:36 crc kubenswrapper[4676]: I1204 15:30:36.313677 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj" event={"ID":"b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4","Type":"ContainerStarted","Data":"c9cf44d681407ca4cd406ca4a33e3d3c07a4f20936c074c8db95e080c5e27564"}
Dec 04 15:30:36 crc kubenswrapper[4676]: I1204 15:30:36.314222 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:36 crc kubenswrapper[4676]: I1204 15:30:36.331994 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj" podStartSLOduration=2.076044261 podStartE2EDuration="6.331970883s" podCreationTimestamp="2025-12-04 15:30:30 +0000 UTC" firstStartedPulling="2025-12-04 15:30:31.083773489 +0000 UTC m=+638.518443346" lastFinishedPulling="2025-12-04 15:30:35.339700111 +0000 UTC m=+642.774369968" observedRunningTime="2025-12-04 15:30:36.329828032 +0000 UTC m=+643.764497889" watchObservedRunningTime="2025-12-04 15:30:36.331970883 +0000 UTC m=+643.766640740"
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.598976 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-stttj"
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956008 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wmbt2"]
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956565 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-controller" containerID="cri-o://8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956647 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="sbdb" containerID="cri-o://1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956684 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956718 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-node" containerID="cri-o://ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956719 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-acl-logging" containerID="cri-o://b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956837 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="northd" containerID="cri-o://4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.956862 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="nbdb" containerID="cri-o://9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" gracePeriod=30
Dec 04 15:30:40 crc kubenswrapper[4676]: I1204 15:30:40.994058 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" containerID="cri-o://8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" gracePeriod=30
Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.204643 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ad0d70_0230_4055_a56e_d83c06c6e0b3.slice/crio-b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ad0d70_0230_4055_a56e_d83c06c6e0b3.slice/crio-1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ad0d70_0230_4055_a56e_d83c06c6e0b3.slice/crio-conmon-9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a.scope\": RecentStats: unable to find data in memory cache]"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.302296 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/3.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.306849 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovn-acl-logging/0.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.307558 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovn-controller/0.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.308760 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.358999 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/2.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.359654 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/1.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.359695 4676 generic.go:334] "Generic (PLEG): container finished" podID="2a201486-d4f3-4677-adad-4028d94e0623" containerID="8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3" exitCode=2
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.359765 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerDied","Data":"8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.359811 4676 scope.go:117] "RemoveContainer" containerID="ceebc96cc115d1e5009d23c18de74d387658931e1fd0204651f7f1d7a309f5a5"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.360363 4676 scope.go:117] "RemoveContainer" containerID="8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3"
Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.360724 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wch9m_openshift-multus(2a201486-d4f3-4677-adad-4028d94e0623)\"" pod="openshift-multus/multus-wch9m" podUID="2a201486-d4f3-4677-adad-4028d94e0623"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.369952 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovnkube-controller/3.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.370713 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-94pnm"]
Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.370954 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="nbdb"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.370973 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="nbdb"
Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.370983 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.370990 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller"
Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371000 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kubecfg-setup"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371006 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kubecfg-setup"
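
The "back-off 20s restarting failed container" error above is the kubelet's CrashLoopBackOff in action: the restart delay starts at 10s and doubles on each subsequent failure, capped at 5m by default, so 20s corresponds to the second recent failure of kube-multus. A sketch of the series:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for i := 0; i < 7; i++ {
            fmt.Println(delay) // 10s 20s 40s 1m20s 2m40s 5m0s 5m0s
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

A clean run longer than the back-off window resets the series, which is why a container that recovers stops accumulating delay.
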
podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371020 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371027 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="northd" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371033 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="northd" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371042 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371047 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371056 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371061 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371072 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-acl-logging" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371077 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-acl-logging" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371084 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="sbdb" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371090 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="sbdb" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371098 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371104 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371113 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-node" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371120 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-node" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371127 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371133 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371262 4676 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-ovn-metrics" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371276 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="nbdb" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371283 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371289 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371296 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371304 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="kube-rbac-proxy-node" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371311 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="northd" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371318 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371325 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="sbdb" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371332 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovn-acl-logging" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371340 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371347 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.371438 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.371445 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerName="ovnkube-controller" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.373444 4676 util.go:30] "No sandbox for pod can be found. 
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.373444 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.377109 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovn-acl-logging/0.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.377959 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wmbt2_f1ad0d70-0230-4055-a56e-d83c06c6e0b3/ovn-controller/0.log"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.378566 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.378672 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.378731 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.378686 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2"
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.378958 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379820 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379834 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379844 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379875 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" exitCode=0
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379884 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" exitCode=143
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379891 4676 generic.go:334] "Generic (PLEG): container finished" podID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" exitCode=143
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379928 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379960 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379971 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379980 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.379990 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380043 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380049 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380055 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380060 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380065 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380072 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380078 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"}
Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380082 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"}
containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380095 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380103 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380109 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380114 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380119 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380123 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380128 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380134 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380138 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380143 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380150 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380156 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380165 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380171 4676 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380176 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380180 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380186 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380191 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380196 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380202 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380207 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380213 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380220 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wmbt2" event={"ID":"f1ad0d70-0230-4055-a56e-d83c06c6e0b3","Type":"ContainerDied","Data":"1aa9bf6672ad90ee6ed4581d5a45ad804e1c37d893bd8d72a0c5ef890f5738e2"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380228 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380237 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380243 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380248 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380253 4676 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380258 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380263 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380268 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380273 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.380278 4676 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.397206 4676 scope.go:117] "RemoveContainer" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.416051 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.434471 4676 scope.go:117] "RemoveContainer" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.452877 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j6vk\" (UniqueName: \"kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.452965 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453004 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453042 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453065 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453085 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453111 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453138 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453174 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453200 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453187 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453231 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453254 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453263 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453295 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453333 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453359 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453296 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453286 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453455 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log" (OuterVolumeSpecName: "node-log") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453454 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket" (OuterVolumeSpecName: "log-socket") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453474 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453486 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453388 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453513 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453537 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash" (OuterVolumeSpecName: "host-slash") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453546 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453558 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453582 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453615 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453614 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453637 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config\") pod \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\" (UID: \"f1ad0d70-0230-4055-a56e-d83c06c6e0b3\") " Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453664 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453678 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.453699 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454131 4676 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454150 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454159 4676 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454168 4676 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454177 4676 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454185 4676 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454194 4676 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454202 4676 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454216 4676 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454224 4676 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-node-log\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454233 4676 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454242 4676 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454253 4676 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 04 
15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454261 4676 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454271 4676 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454280 4676 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.454421 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.456755 4676 scope.go:117] "RemoveContainer" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.460714 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk" (OuterVolumeSpecName: "kube-api-access-6j6vk") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "kube-api-access-6j6vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.461352 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.469142 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f1ad0d70-0230-4055-a56e-d83c06c6e0b3" (UID: "f1ad0d70-0230-4055-a56e-d83c06c6e0b3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.472280 4676 scope.go:117] "RemoveContainer" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.485590 4676 scope.go:117] "RemoveContainer" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.500516 4676 scope.go:117] "RemoveContainer" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.519460 4676 scope.go:117] "RemoveContainer" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.538690 4676 scope.go:117] "RemoveContainer" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.553461 4676 scope.go:117] "RemoveContainer" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-etc-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555075 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-config\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555138 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-node-log\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555203 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-env-overrides\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555229 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-netd\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555254 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-script-lib\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 
15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555286 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c3c7869-de45-4c70-8669-79e28bc76420-ovn-node-metrics-cert\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555328 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-bin\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555346 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-var-lib-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-kubelet\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555400 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-ovn\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555415 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-log-socket\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555444 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-systemd\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555461 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-netns\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555490 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-slash\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555521 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555552 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555579 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-systemd-units\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555598 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f876\" (UniqueName: \"kubernetes.io/projected/8c3c7869-de45-4c70-8669-79e28bc76420-kube-api-access-9f876\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555646 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555657 4676 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555666 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j6vk\" (UniqueName: \"kubernetes.io/projected/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-kube-api-access-6j6vk\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.555676 4676 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f1ad0d70-0230-4055-a56e-d83c06c6e0b3-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.568150 4676 scope.go:117] "RemoveContainer" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.568647 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": container with ID starting with 8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace not found: ID does not exist" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.568709 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} err="failed to get container status \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": rpc error: code = NotFound desc = could not find container \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": container with ID starting with 8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.568797 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.569220 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": container with ID starting with dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40 not found: ID does not exist" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569257 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} err="failed to get container status \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": rpc error: code = NotFound desc = could not find container \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": container with ID starting with dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569280 4676 scope.go:117] "RemoveContainer" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.569593 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": container with ID starting with 1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641 not found: ID does not exist" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569630 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} err="failed to get container status \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": rpc error: code = NotFound desc = could not find container \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": container with ID starting with 1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569648 4676 scope.go:117] "RemoveContainer" 
containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.569874 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": container with ID starting with 9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a not found: ID does not exist" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569949 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} err="failed to get container status \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": rpc error: code = NotFound desc = could not find container \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": container with ID starting with 9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.569968 4676 scope.go:117] "RemoveContainer" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.570214 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": container with ID starting with 4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8 not found: ID does not exist" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570235 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} err="failed to get container status \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": rpc error: code = NotFound desc = could not find container \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": container with ID starting with 4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570247 4676 scope.go:117] "RemoveContainer" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.570455 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": container with ID starting with be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28 not found: ID does not exist" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570476 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} err="failed to get container status \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": rpc error: code = NotFound desc = could not find container \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": container with ID starting with 
be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570486 4676 scope.go:117] "RemoveContainer" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.570720 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": container with ID starting with ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be not found: ID does not exist" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570750 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} err="failed to get container status \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": rpc error: code = NotFound desc = could not find container \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": container with ID starting with ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.570772 4676 scope.go:117] "RemoveContainer" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.571065 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": container with ID starting with b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334 not found: ID does not exist" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571093 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} err="failed to get container status \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": rpc error: code = NotFound desc = could not find container \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": container with ID starting with b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571110 4676 scope.go:117] "RemoveContainer" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.571355 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": container with ID starting with 8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4 not found: ID does not exist" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571397 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} err="failed to get container status \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": rpc 
error: code = NotFound desc = could not find container \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": container with ID starting with 8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571412 4676 scope.go:117] "RemoveContainer" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: E1204 15:30:41.571636 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": container with ID starting with 99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247 not found: ID does not exist" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571666 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} err="failed to get container status \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": rpc error: code = NotFound desc = could not find container \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": container with ID starting with 99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571684 4676 scope.go:117] "RemoveContainer" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571895 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} err="failed to get container status \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": rpc error: code = NotFound desc = could not find container \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": container with ID starting with 8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.571943 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572152 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} err="failed to get container status \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": rpc error: code = NotFound desc = could not find container \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": container with ID starting with dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572178 4676 scope.go:117] "RemoveContainer" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572406 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} err="failed to get container status \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": rpc 
error: code = NotFound desc = could not find container \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": container with ID starting with 1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572429 4676 scope.go:117] "RemoveContainer" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572664 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} err="failed to get container status \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": rpc error: code = NotFound desc = could not find container \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": container with ID starting with 9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.572687 4676 scope.go:117] "RemoveContainer" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.573323 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} err="failed to get container status \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": rpc error: code = NotFound desc = could not find container \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": container with ID starting with 4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.573471 4676 scope.go:117] "RemoveContainer" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.573729 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} err="failed to get container status \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": rpc error: code = NotFound desc = could not find container \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": container with ID starting with be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.573766 4676 scope.go:117] "RemoveContainer" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574042 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} err="failed to get container status \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": rpc error: code = NotFound desc = could not find container \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": container with ID starting with ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574074 4676 scope.go:117] "RemoveContainer" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc 
kubenswrapper[4676]: I1204 15:30:41.574337 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} err="failed to get container status \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": rpc error: code = NotFound desc = could not find container \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": container with ID starting with b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574367 4676 scope.go:117] "RemoveContainer" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574586 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} err="failed to get container status \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": rpc error: code = NotFound desc = could not find container \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": container with ID starting with 8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574613 4676 scope.go:117] "RemoveContainer" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574848 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} err="failed to get container status \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": rpc error: code = NotFound desc = could not find container \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": container with ID starting with 99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.574879 4676 scope.go:117] "RemoveContainer" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575190 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} err="failed to get container status \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": rpc error: code = NotFound desc = could not find container \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": container with ID starting with 8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575226 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575417 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} err="failed to get container status \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": rpc error: code = NotFound desc = could not find container \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": container with ID 
starting with dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575445 4676 scope.go:117] "RemoveContainer" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575893 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} err="failed to get container status \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": rpc error: code = NotFound desc = could not find container \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": container with ID starting with 1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.575943 4676 scope.go:117] "RemoveContainer" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576232 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} err="failed to get container status \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": rpc error: code = NotFound desc = could not find container \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": container with ID starting with 9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576256 4676 scope.go:117] "RemoveContainer" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576498 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} err="failed to get container status \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": rpc error: code = NotFound desc = could not find container \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": container with ID starting with 4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576516 4676 scope.go:117] "RemoveContainer" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576734 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} err="failed to get container status \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": rpc error: code = NotFound desc = could not find container \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": container with ID starting with be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.576762 4676 scope.go:117] "RemoveContainer" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577164 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} err="failed to get container status \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": rpc error: code = NotFound desc = could not find container \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": container with ID starting with ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577190 4676 scope.go:117] "RemoveContainer" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577442 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} err="failed to get container status \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": rpc error: code = NotFound desc = could not find container \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": container with ID starting with b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577461 4676 scope.go:117] "RemoveContainer" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577691 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} err="failed to get container status \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": rpc error: code = NotFound desc = could not find container \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": container with ID starting with 8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577716 4676 scope.go:117] "RemoveContainer" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.577986 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} err="failed to get container status \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": rpc error: code = NotFound desc = could not find container \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": container with ID starting with 99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578014 4676 scope.go:117] "RemoveContainer" containerID="8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578278 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace"} err="failed to get container status \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": rpc error: code = NotFound desc = could not find container \"8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace\": container with ID starting with 8834d7d02140c692f1a998ae1f16c5d9e11422aeb62a2441bcbb3b274ea5cace not found: ID does not exist" Dec 
04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578329 4676 scope.go:117] "RemoveContainer" containerID="dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578595 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40"} err="failed to get container status \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": rpc error: code = NotFound desc = could not find container \"dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40\": container with ID starting with dde2b079838176c983d693f9f4e512b3c3a3bae4f6e9c3219506d2c3da21db40 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578622 4676 scope.go:117] "RemoveContainer" containerID="1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578853 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641"} err="failed to get container status \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": rpc error: code = NotFound desc = could not find container \"1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641\": container with ID starting with 1f2dd927db9989e6bd28689c817002df3675a413fbf1e4c919f4f64919b77641 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.578880 4676 scope.go:117] "RemoveContainer" containerID="9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579200 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a"} err="failed to get container status \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": rpc error: code = NotFound desc = could not find container \"9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a\": container with ID starting with 9065ac151bb455a13e50a0502c3b9e151bd42cce86513d9ccfbfe9ebb5720f4a not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579221 4676 scope.go:117] "RemoveContainer" containerID="4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579489 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8"} err="failed to get container status \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": rpc error: code = NotFound desc = could not find container \"4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8\": container with ID starting with 4a4a6a0db31f09569c10983309d76c71907f8e447924fb053d3f73890d22f7f8 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579514 4676 scope.go:117] "RemoveContainer" containerID="be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579761 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28"} err="failed to get container status 
\"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": rpc error: code = NotFound desc = could not find container \"be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28\": container with ID starting with be21857ec48b5bfd4a86ea243d0ba4f96fd2bde534b3a5973a4a2e6a102eee28 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.579784 4676 scope.go:117] "RemoveContainer" containerID="ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580001 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be"} err="failed to get container status \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": rpc error: code = NotFound desc = could not find container \"ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be\": container with ID starting with ab54f5c4e210e66a7382eb9f38ad51cdaa1b29ec1f79a7ef85bc18580db649be not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580031 4676 scope.go:117] "RemoveContainer" containerID="b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580291 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334"} err="failed to get container status \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": rpc error: code = NotFound desc = could not find container \"b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334\": container with ID starting with b229cd5cebde3ab2f902f72a31601b4ab478c949465fe1b226d9c86884e08334 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580310 4676 scope.go:117] "RemoveContainer" containerID="8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580704 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4"} err="failed to get container status \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": rpc error: code = NotFound desc = could not find container \"8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4\": container with ID starting with 8cf53f5e77ce2dc6599eb8681289fe27a3957d9f993fe311e27adb46e17256c4 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580726 4676 scope.go:117] "RemoveContainer" containerID="99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.580978 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247"} err="failed to get container status \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": rpc error: code = NotFound desc = could not find container \"99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247\": container with ID starting with 99908909f434519965f27dc26cb84c5aab8353dfa1a5a2d04ed6da7c3a41a247 not found: ID does not exist" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.656959 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-node-log\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657038 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-env-overrides\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657062 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-netd\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-script-lib\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-node-log\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657110 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c3c7869-de45-4c70-8669-79e28bc76420-ovn-node-metrics-cert\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-var-lib-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-bin\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657250 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-kubelet\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657263 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-netd\") pod 
\"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657330 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-kubelet\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657275 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-ovn\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657295 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-var-lib-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657393 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-log-socket\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657410 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-cni-bin\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-systemd\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657441 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-systemd\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-slash\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657557 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-netns\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 
15:30:41.657576 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657586 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-slash\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-log-socket\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657607 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657624 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-netns\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657649 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-run-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657662 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657300 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-run-ovn\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657725 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-systemd-units\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657741 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f876\" (UniqueName: \"kubernetes.io/projected/8c3c7869-de45-4c70-8669-79e28bc76420-kube-api-access-9f876\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657751 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-env-overrides\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657768 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657773 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-etc-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-etc-openvswitch\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657795 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8c3c7869-de45-4c70-8669-79e28bc76420-systemd-units\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657842 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-script-lib\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.657849 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-config\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.658389 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8c3c7869-de45-4c70-8669-79e28bc76420-ovnkube-config\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.660107 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c3c7869-de45-4c70-8669-79e28bc76420-ovn-node-metrics-cert\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.672751 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f876\" (UniqueName: \"kubernetes.io/projected/8c3c7869-de45-4c70-8669-79e28bc76420-kube-api-access-9f876\") pod \"ovnkube-node-94pnm\" (UID: \"8c3c7869-de45-4c70-8669-79e28bc76420\") " pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.704967 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.719979 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wmbt2"] Dec 04 15:30:41 crc kubenswrapper[4676]: I1204 15:30:41.723987 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-wmbt2"] Dec 04 15:30:42 crc kubenswrapper[4676]: I1204 15:30:42.390746 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/2.log" Dec 04 15:30:42 crc kubenswrapper[4676]: I1204 15:30:42.394017 4676 generic.go:334] "Generic (PLEG): container finished" podID="8c3c7869-de45-4c70-8669-79e28bc76420" containerID="5012e6b26629cfc6bc1b18a42fbc4dcac085d054f698b7e05620a34198c63e05" exitCode=0 Dec 04 15:30:42 crc kubenswrapper[4676]: I1204 15:30:42.394057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerDied","Data":"5012e6b26629cfc6bc1b18a42fbc4dcac085d054f698b7e05620a34198c63e05"} Dec 04 15:30:42 crc kubenswrapper[4676]: I1204 15:30:42.394083 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"b4aff7d5c3c9a46b24ff5d4540e1be66222461b66aaef364dbf9463f5a5314bc"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.390740 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ad0d70-0230-4055-a56e-d83c06c6e0b3" path="/var/lib/kubelet/pods/f1ad0d70-0230-4055-a56e-d83c06c6e0b3/volumes" Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.401403 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"de5ae0d4a13831ca6737035fd1a1f46b850b0ef8cba0cd82c39a9b1bf8ff25ad"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.401439 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"1045da12058217783315e0f15eb898f2cd585d20926df485fc4732066e27b5c2"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 
15:30:43.401450 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"da8b8b7a2c458b674eb54053fd90ec601ef37f1751a023566e25aa6596feb496"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.401460 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"3ac6b23176db4d434705e9f6302ff0e0e1d79ae7e3187785ddf49acb9a116a20"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.401472 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"bd8017f7aa190627828bcb5a952905256c3af4fa2e6ab0fcea09171ebd36a2e0"} Dec 04 15:30:43 crc kubenswrapper[4676]: I1204 15:30:43.401481 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"100123b954bbe91e298ccbf2a2a3e088a86c4ffe215035ebdd959e328fc2d41d"} Dec 04 15:30:45 crc kubenswrapper[4676]: I1204 15:30:45.414897 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"0f86e26f541cff0ccca730057c72a5aa84c85cad6cc371e514e1cbdcee75d9ac"} Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.434514 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" event={"ID":"8c3c7869-de45-4c70-8669-79e28bc76420","Type":"ContainerStarted","Data":"9a66b1de0ab0397cc3c8880ead5244577dca872bf48687ffd6a59bb9554293bf"} Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.435017 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.435531 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.435570 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.465442 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.466080 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:30:48 crc kubenswrapper[4676]: I1204 15:30:48.469048 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" podStartSLOduration=7.469033759 podStartE2EDuration="7.469033759s" podCreationTimestamp="2025-12-04 15:30:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:30:48.466890928 +0000 UTC m=+655.901560795" watchObservedRunningTime="2025-12-04 15:30:48.469033759 +0000 UTC m=+655.903703636" Dec 04 15:30:53 crc kubenswrapper[4676]: I1204 15:30:53.388381 4676 scope.go:117] "RemoveContainer" containerID="8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3"
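The pod_startup_latency_tracker record above puts ovnkube-node-94pnm at podStartSLOduration=7.469033759: the pod was created at 15:30:41 and its last container was observed running at 15:30:48.469033759, with no image pulls counted (the pulling timestamps are the zero time). The figure can be re-derived from the record's own timestamps with the standard library; the layout string is an assumption matching the printed format:

```go
package main

import (
	"fmt"
	"time"
)

// Recomputes the podStartSLOduration figure from podCreationTimestamp
// and watchObservedRunningTime as printed in the record above.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-04 15:30:41 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-12-04 15:30:48.469033759 +0000 UTC")
	if err != nil {
		panic(err)
	}
	fmt.Println(running.Sub(created)) // 7.469033759s
}
```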
Dec 04 15:30:53 crc kubenswrapper[4676]: E1204 15:30:53.389404 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-wch9m_openshift-multus(2a201486-d4f3-4677-adad-4028d94e0623)\"" pod="openshift-multus/multus-wch9m" podUID="2a201486-d4f3-4677-adad-4028d94e0623" Dec 04 15:31:04 crc kubenswrapper[4676]: I1204 15:31:04.384196 4676 scope.go:117] "RemoveContainer" containerID="8088b0e22f4f19774d73bca1f606c4eb2a1295199b115b5884111164ee215ff3" Dec 04 15:31:05 crc kubenswrapper[4676]: I1204 15:31:05.526778 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wch9m_2a201486-d4f3-4677-adad-4028d94e0623/kube-multus/2.log" Dec 04 15:31:05 crc kubenswrapper[4676]: I1204 15:31:05.527180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wch9m" event={"ID":"2a201486-d4f3-4677-adad-4028d94e0623","Type":"ContainerStarted","Data":"17c265d74da906cbea839cbf29503e0b869fa84e64e57f3bb71af4c3ea10ea57"} Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.732171 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r"] Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.733770 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.735887 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.746660 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r"] Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.797281 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.797463 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh7sq\" (UniqueName: \"kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.797521 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r"
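The E-level record at 15:30:53 shows the crash-looping kube-multus container being held back for 20s before its next restart. Kubelet's restart back-off starts at 10s, doubles on each failed restart, and is capped at 5m, so a 20s hold corresponds to the second attempt; once the container stays up long enough the counter resets, which is why it restarts cleanly at 15:31:04. A sketch of that schedule follows; the constants mirror the kubelet defaults, and this is an illustration of the policy, not kubelet code:

```go
package main

import (
	"fmt"
	"time"
)

// backoff returns the crash-loop delay before restart attempt n,
// using the kubelet-style schedule: 10s base, doubling, 5m cap.
func backoff(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, backoff(r))
	}
	// restart 1 -> wait 20s, matching the CrashLoopBackOff message above.
}
```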
\"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.898430 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh7sq\" (UniqueName: \"kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.898482 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.898939 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.899062 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:09 crc kubenswrapper[4676]: I1204 15:31:09.919533 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh7sq\" (UniqueName: \"kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:10 crc kubenswrapper[4676]: I1204 15:31:10.050719 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:10 crc kubenswrapper[4676]: I1204 15:31:10.470183 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r"] Dec 04 15:31:10 crc kubenswrapper[4676]: W1204 15:31:10.477894 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d73b25e_dd84_468b_81dd_5d584a083fe0.slice/crio-7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f WatchSource:0}: Error finding container 7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f: Status 404 returned error can't find the container with id 7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f Dec 04 15:31:10 crc kubenswrapper[4676]: I1204 15:31:10.700984 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerStarted","Data":"7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f"} Dec 04 15:31:11 crc kubenswrapper[4676]: I1204 15:31:11.710825 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerStarted","Data":"2a8198ac3da1e2e244ceb744e418fdfcde81c10a923b0b622e4576f2e1838105"} Dec 04 15:31:11 crc kubenswrapper[4676]: I1204 15:31:11.734799 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-94pnm" Dec 04 15:31:12 crc kubenswrapper[4676]: I1204 15:31:12.719516 4676 generic.go:334] "Generic (PLEG): container finished" podID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerID="2a8198ac3da1e2e244ceb744e418fdfcde81c10a923b0b622e4576f2e1838105" exitCode=0 Dec 04 15:31:12 crc kubenswrapper[4676]: I1204 15:31:12.719600 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerDied","Data":"2a8198ac3da1e2e244ceb744e418fdfcde81c10a923b0b622e4576f2e1838105"} Dec 04 15:31:14 crc kubenswrapper[4676]: I1204 15:31:14.734591 4676 generic.go:334] "Generic (PLEG): container finished" podID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerID="ad248b47e4e9395b7f9c6ee8408c0dce343ce020d15e9da7fa7f7daca69cca0f" exitCode=0 Dec 04 15:31:14 crc kubenswrapper[4676]: I1204 15:31:14.734710 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerDied","Data":"ad248b47e4e9395b7f9c6ee8408c0dce343ce020d15e9da7fa7f7daca69cca0f"} Dec 04 15:31:15 crc kubenswrapper[4676]: I1204 15:31:15.742276 4676 generic.go:334] "Generic (PLEG): container finished" podID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerID="154b36210fd46cce49f5299c7bc67a26000ccf899664cfc6aead3888e4bcc33d" exitCode=0 Dec 04 15:31:15 crc kubenswrapper[4676]: I1204 15:31:15.742297 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" 
event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerDied","Data":"154b36210fd46cce49f5299c7bc67a26000ccf899664cfc6aead3888e4bcc33d"} Dec 04 15:31:16 crc kubenswrapper[4676]: I1204 15:31:16.967576 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.124419 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh7sq\" (UniqueName: \"kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq\") pod \"6d73b25e-dd84-468b-81dd-5d584a083fe0\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.124498 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle\") pod \"6d73b25e-dd84-468b-81dd-5d584a083fe0\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.124581 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util\") pod \"6d73b25e-dd84-468b-81dd-5d584a083fe0\" (UID: \"6d73b25e-dd84-468b-81dd-5d584a083fe0\") " Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.127104 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle" (OuterVolumeSpecName: "bundle") pod "6d73b25e-dd84-468b-81dd-5d584a083fe0" (UID: "6d73b25e-dd84-468b-81dd-5d584a083fe0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.131185 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq" (OuterVolumeSpecName: "kube-api-access-gh7sq") pod "6d73b25e-dd84-468b-81dd-5d584a083fe0" (UID: "6d73b25e-dd84-468b-81dd-5d584a083fe0"). InnerVolumeSpecName "kube-api-access-gh7sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.137380 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util" (OuterVolumeSpecName: "util") pod "6d73b25e-dd84-468b-81dd-5d584a083fe0" (UID: "6d73b25e-dd84-468b-81dd-5d584a083fe0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.225972 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-util\") on node \"crc\" DevicePath \"\"" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.226016 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh7sq\" (UniqueName: \"kubernetes.io/projected/6d73b25e-dd84-468b-81dd-5d584a083fe0-kube-api-access-gh7sq\") on node \"crc\" DevicePath \"\"" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.226031 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6d73b25e-dd84-468b-81dd-5d584a083fe0-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.753833 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" event={"ID":"6d73b25e-dd84-468b-81dd-5d584a083fe0","Type":"ContainerDied","Data":"7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f"} Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.753890 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r" Dec 04 15:31:17 crc kubenswrapper[4676]: I1204 15:31:17.753881 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7183e2e210d376539e4552f8e412b651c00355ceb7a8486b414b0441fbe4bb8f" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.299410 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st"] Dec 04 15:31:29 crc kubenswrapper[4676]: E1204 15:31:29.300375 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="util" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.300411 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="util" Dec 04 15:31:29 crc kubenswrapper[4676]: E1204 15:31:29.300432 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="pull" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.300441 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="pull" Dec 04 15:31:29 crc kubenswrapper[4676]: E1204 15:31:29.300448 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="extract" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.300457 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="extract" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.300640 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d73b25e-dd84-468b-81dd-5d584a083fe0" containerName="extract" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.301411 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.305045 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.305173 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-s9g8d" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.305186 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.320568 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.346427 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.347354 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.355709 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.356541 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-ng4fg" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.372213 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.375654 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.398251 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.412820 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.455024 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.456115 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk428\" (UniqueName: \"kubernetes.io/projected/cc91d5b7-ea24-4585-9ac8-bd227c1a186e-kube-api-access-sk428\") pod \"obo-prometheus-operator-668cf9dfbb-vc8st\" (UID: \"cc91d5b7-ea24-4585-9ac8-bd227c1a186e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.456226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.508418 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-fl2ph"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.509551 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.511958 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-dfqz6" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.513139 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.521340 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-fl2ph"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.659899 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.659986 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/88a9075d-6a0a-4172-b28c-979ad7fff84b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: \"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.660041 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk428\" (UniqueName: \"kubernetes.io/projected/cc91d5b7-ea24-4585-9ac8-bd227c1a186e-kube-api-access-sk428\") pod \"obo-prometheus-operator-668cf9dfbb-vc8st\" (UID: \"cc91d5b7-ea24-4585-9ac8-bd227c1a186e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.660077 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.660160 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.660185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62jk\" (UniqueName: \"kubernetes.io/projected/88a9075d-6a0a-4172-b28c-979ad7fff84b-kube-api-access-b62jk\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: \"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.660228 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.667896 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.668409 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ff573696-bc37-470b-a8b6-14c5218baa8f-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl\" (UID: \"ff573696-bc37-470b-a8b6-14c5218baa8f\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.673434 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.686928 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk428\" (UniqueName: \"kubernetes.io/projected/cc91d5b7-ea24-4585-9ac8-bd227c1a186e-kube-api-access-sk428\") pod \"obo-prometheus-operator-668cf9dfbb-vc8st\" (UID: \"cc91d5b7-ea24-4585-9ac8-bd227c1a186e\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.761000 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.761083 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b62jk\" (UniqueName: \"kubernetes.io/projected/88a9075d-6a0a-4172-b28c-979ad7fff84b-kube-api-access-b62jk\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: \"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.761131 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.761938 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/88a9075d-6a0a-4172-b28c-979ad7fff84b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: 
\"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.766569 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.771037 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/88a9075d-6a0a-4172-b28c-979ad7fff84b-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: \"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.771406 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d99fa21-223e-4928-a32c-52a3ccbd69d4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx\" (UID: \"6d99fa21-223e-4928-a32c-52a3ccbd69d4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.802353 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62jk\" (UniqueName: \"kubernetes.io/projected/88a9075d-6a0a-4172-b28c-979ad7fff84b-kube-api-access-b62jk\") pod \"observability-operator-d8bb48f5d-fl2ph\" (UID: \"88a9075d-6a0a-4172-b28c-979ad7fff84b\") " pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.834369 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.901365 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-lxcjx"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.902285 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.904072 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-fxvvf" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.921081 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-lxcjx"] Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.936268 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" Dec 04 15:31:29 crc kubenswrapper[4676]: I1204 15:31:29.997179 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.136156 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/667d2ce6-ef89-4b36-a200-194e5f7861ad-openshift-service-ca\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.136253 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs47s\" (UniqueName: \"kubernetes.io/projected/667d2ce6-ef89-4b36-a200-194e5f7861ad-kube-api-access-gs47s\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.239237 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs47s\" (UniqueName: \"kubernetes.io/projected/667d2ce6-ef89-4b36-a200-194e5f7861ad-kube-api-access-gs47s\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.239332 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/667d2ce6-ef89-4b36-a200-194e5f7861ad-openshift-service-ca\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.240393 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/667d2ce6-ef89-4b36-a200-194e5f7861ad-openshift-service-ca\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.265340 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs47s\" (UniqueName: \"kubernetes.io/projected/667d2ce6-ef89-4b36-a200-194e5f7861ad-kube-api-access-gs47s\") pod \"perses-operator-5446b9c989-lxcjx\" (UID: \"667d2ce6-ef89-4b36-a200-194e5f7861ad\") " pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.391602 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl"] Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.556499 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.597712 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-fl2ph"] Dec 04 15:31:30 crc kubenswrapper[4676]: W1204 15:31:30.640275 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a9075d_6a0a_4172_b28c_979ad7fff84b.slice/crio-0324e93329f334eff6181e0e335c013c08bf56255306c20bbb7b67fec8cf70e5 WatchSource:0}: Error finding container 0324e93329f334eff6181e0e335c013c08bf56255306c20bbb7b67fec8cf70e5: Status 404 returned error can't find the container with id 0324e93329f334eff6181e0e335c013c08bf56255306c20bbb7b67fec8cf70e5 Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.813114 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st"] Dec 04 15:31:30 crc kubenswrapper[4676]: I1204 15:31:30.898337 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx"] Dec 04 15:31:31 crc kubenswrapper[4676]: I1204 15:31:31.177364 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" event={"ID":"6d99fa21-223e-4928-a32c-52a3ccbd69d4","Type":"ContainerStarted","Data":"ac23f71cb5be56280603896662b7adda4c14e88b5d7a6e48cbff7d25af9761c0"} Dec 04 15:31:31 crc kubenswrapper[4676]: I1204 15:31:31.187283 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" event={"ID":"88a9075d-6a0a-4172-b28c-979ad7fff84b","Type":"ContainerStarted","Data":"0324e93329f334eff6181e0e335c013c08bf56255306c20bbb7b67fec8cf70e5"} Dec 04 15:31:31 crc kubenswrapper[4676]: I1204 15:31:31.190147 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" event={"ID":"ff573696-bc37-470b-a8b6-14c5218baa8f","Type":"ContainerStarted","Data":"8b3e7b82fff7a19eba5fd94688181f510816142e26f3f04c4949012dfb7898c8"} Dec 04 15:31:31 crc kubenswrapper[4676]: I1204 15:31:31.204202 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" event={"ID":"cc91d5b7-ea24-4585-9ac8-bd227c1a186e","Type":"ContainerStarted","Data":"0e51b353020d9739700da6e6d72aecc2438fdda0846a28325b23aa80c4569a25"} Dec 04 15:31:31 crc kubenswrapper[4676]: I1204 15:31:31.210678 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-lxcjx"] Dec 04 15:31:32 crc kubenswrapper[4676]: I1204 15:31:32.215844 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" event={"ID":"667d2ce6-ef89-4b36-a200-194e5f7861ad","Type":"ContainerStarted","Data":"b4db8fa184ce241d35c8a5b858887d57c9f01788606a5c112dd01e9daeede292"} Dec 04 15:31:46 crc kubenswrapper[4676]: I1204 15:31:46.026606 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:31:46 crc kubenswrapper[4676]: I1204 15:31:46.027094 4676 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:31:56 crc kubenswrapper[4676]: E1204 15:31:56.914601 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 04 15:31:56 crc kubenswrapper[4676]: E1204 15:31:56.915512 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sk428,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-vc8st_openshift-operators(cc91d5b7-ea24-4585-9ac8-bd227c1a186e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:31:56 crc kubenswrapper[4676]: E1204 15:31:56.916780 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" podUID="cc91d5b7-ea24-4585-9ac8-bd227c1a186e" Dec 04 15:31:57 crc kubenswrapper[4676]: E1204 15:31:57.060480 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" podUID="cc91d5b7-ea24-4585-9ac8-bd227c1a186e" Dec 04 15:31:57 crc kubenswrapper[4676]: E1204 15:31:57.508988 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 04 15:31:57 crc kubenswrapper[4676]: E1204 15:31:57.509540 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs47s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-lxcjx_openshift-operators(667d2ce6-ef89-4b36-a200-194e5f7861ad): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:31:57 crc 
kubenswrapper[4676]: E1204 15:31:57.511208 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" podUID="667d2ce6-ef89-4b36-a200-194e5f7861ad" Dec 04 15:31:58 crc kubenswrapper[4676]: E1204 15:31:58.067825 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" podUID="667d2ce6-ef89-4b36-a200-194e5f7861ad" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.313702 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.314191 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91
596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b62jk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-fl2ph_openshift-operators(88a9075d-6a0a-4172-b28c-979ad7fff84b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.315456 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" podUID="88a9075d-6a0a-4172-b28c-979ad7fff84b" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.621023 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.621354 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx_openshift-operators(6d99fa21-223e-4928-a32c-52a3ccbd69d4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.622576 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" podUID="6d99fa21-223e-4928-a32c-52a3ccbd69d4" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.630752 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.631005 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl_openshift-operators(ff573696-bc37-470b-a8b6-14c5218baa8f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 04 15:32:00 crc kubenswrapper[4676]: E1204 15:32:00.632201 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" podUID="ff573696-bc37-470b-a8b6-14c5218baa8f" Dec 04 15:32:01 crc kubenswrapper[4676]: E1204 15:32:01.137637 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" podUID="88a9075d-6a0a-4172-b28c-979ad7fff84b" Dec 04 15:32:01 crc kubenswrapper[4676]: E1204 15:32:01.137637 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" podUID="ff573696-bc37-470b-a8b6-14c5218baa8f" Dec 04 15:32:01 crc kubenswrapper[4676]: E1204 15:32:01.137641 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" podUID="6d99fa21-223e-4928-a32c-52a3ccbd69d4" Dec 04 15:32:13 crc kubenswrapper[4676]: I1204 15:32:13.342734 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" event={"ID":"667d2ce6-ef89-4b36-a200-194e5f7861ad","Type":"ContainerStarted","Data":"56d6cd141b55f543949a2c68f220966fd0a6a61d54e2f92852f3c43e6b1fb87b"} Dec 04 15:32:13 crc kubenswrapper[4676]: I1204 15:32:13.343552 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:32:13 crc kubenswrapper[4676]: I1204 15:32:13.344133 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" event={"ID":"cc91d5b7-ea24-4585-9ac8-bd227c1a186e","Type":"ContainerStarted","Data":"30cbc4ae777b6376a9954986e580eacfcfc96e24a17fab0140374f431d38f608"} Dec 04 15:32:13 crc kubenswrapper[4676]: I1204 15:32:13.369157 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" podStartSLOduration=2.99093897 podStartE2EDuration="44.369117256s" podCreationTimestamp="2025-12-04 15:31:29 +0000 UTC" firstStartedPulling="2025-12-04 15:31:31.229462187 +0000 UTC m=+698.664132044" lastFinishedPulling="2025-12-04 15:32:12.607640473 +0000 UTC m=+740.042310330" observedRunningTime="2025-12-04 15:32:13.361202109 +0000 UTC m=+740.795871996" watchObservedRunningTime="2025-12-04 15:32:13.369117256 +0000 UTC m=+740.803787113" Dec 04 15:32:13 crc kubenswrapper[4676]: I1204 15:32:13.382268 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vc8st" podStartSLOduration=2.606046594 podStartE2EDuration="44.382247832s" podCreationTimestamp="2025-12-04 15:31:29 +0000 UTC" firstStartedPulling="2025-12-04 15:31:30.830694323 +0000 UTC m=+698.265364180" lastFinishedPulling="2025-12-04 15:32:12.606895561 +0000 UTC m=+740.041565418" observedRunningTime="2025-12-04 15:32:13.378631939 +0000 UTC m=+740.813301826" watchObservedRunningTime="2025-12-04 15:32:13.382247832 +0000 UTC m=+740.816917689" Dec 04 15:32:16 crc kubenswrapper[4676]: I1204 15:32:16.026843 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:32:16 crc kubenswrapper[4676]: I1204 15:32:16.027258 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:32:16 crc kubenswrapper[4676]: I1204 15:32:16.361263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" event={"ID":"ff573696-bc37-470b-a8b6-14c5218baa8f","Type":"ContainerStarted","Data":"fb77391add9f981c043d6e92faaec9679f0459d03c32dba04bab9a9a8bb144ee"} Dec 04 15:32:16 crc kubenswrapper[4676]: I1204 15:32:16.363090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" event={"ID":"6d99fa21-223e-4928-a32c-52a3ccbd69d4","Type":"ContainerStarted","Data":"a3d76c1223781073e3593bae6e6ce4ad5d278fda3510b0bf1a51ae8c77c5ecf0"} Dec 04 15:32:16 crc 
kubenswrapper[4676]: I1204 15:32:16.383354 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl" podStartSLOduration=-9223371989.471453 podStartE2EDuration="47.38332288s" podCreationTimestamp="2025-12-04 15:31:29 +0000 UTC" firstStartedPulling="2025-12-04 15:31:30.396140865 +0000 UTC m=+697.830810722" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:32:16.381647742 +0000 UTC m=+743.816317599" watchObservedRunningTime="2025-12-04 15:32:16.38332288 +0000 UTC m=+743.817992737" Dec 04 15:32:16 crc kubenswrapper[4676]: I1204 15:32:16.422703 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx" podStartSLOduration=3.197938349 podStartE2EDuration="47.422675088s" podCreationTimestamp="2025-12-04 15:31:29 +0000 UTC" firstStartedPulling="2025-12-04 15:31:30.948391165 +0000 UTC m=+698.383061012" lastFinishedPulling="2025-12-04 15:32:15.173127894 +0000 UTC m=+742.607797751" observedRunningTime="2025-12-04 15:32:16.417974163 +0000 UTC m=+743.852644020" watchObservedRunningTime="2025-12-04 15:32:16.422675088 +0000 UTC m=+743.857344945" Dec 04 15:32:18 crc kubenswrapper[4676]: I1204 15:32:18.376345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" event={"ID":"88a9075d-6a0a-4172-b28c-979ad7fff84b","Type":"ContainerStarted","Data":"dc724c6ad19c113ef66fa81a9e4693b207ae8037b34327df4b50a362e96bdaa4"} Dec 04 15:32:18 crc kubenswrapper[4676]: I1204 15:32:18.376991 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:32:18 crc kubenswrapper[4676]: I1204 15:32:18.379291 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" Dec 04 15:32:18 crc kubenswrapper[4676]: I1204 15:32:18.398609 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-fl2ph" podStartSLOduration=2.7453973769999998 podStartE2EDuration="49.39858771s" podCreationTimestamp="2025-12-04 15:31:29 +0000 UTC" firstStartedPulling="2025-12-04 15:31:30.644821859 +0000 UTC m=+698.079491716" lastFinishedPulling="2025-12-04 15:32:17.298012192 +0000 UTC m=+744.732682049" observedRunningTime="2025-12-04 15:32:18.397624462 +0000 UTC m=+745.832294349" watchObservedRunningTime="2025-12-04 15:32:18.39858771 +0000 UTC m=+745.833257567" Dec 04 15:32:20 crc kubenswrapper[4676]: I1204 15:32:20.560217 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-lxcjx" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.287548 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6"] Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.289877 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.292282 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.338842 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6"] Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.429081 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn5kc\" (UniqueName: \"kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.429187 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.429289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.530186 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.530326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn5kc\" (UniqueName: \"kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.530360 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.530920 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.531026 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.562937 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn5kc\" (UniqueName: \"kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.611144 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:39 crc kubenswrapper[4676]: I1204 15:32:39.907412 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6"] Dec 04 15:32:40 crc kubenswrapper[4676]: I1204 15:32:40.724246 4676 generic.go:334] "Generic (PLEG): container finished" podID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerID="c1acff36ba00d190675fe29d27320d19c469b04c84711c2af0393df160696435" exitCode=0 Dec 04 15:32:40 crc kubenswrapper[4676]: I1204 15:32:40.724348 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" event={"ID":"d43defed-8b48-4daa-83b5-3b44b845c0d8","Type":"ContainerDied","Data":"c1acff36ba00d190675fe29d27320d19c469b04c84711c2af0393df160696435"} Dec 04 15:32:40 crc kubenswrapper[4676]: I1204 15:32:40.724491 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" event={"ID":"d43defed-8b48-4daa-83b5-3b44b845c0d8","Type":"ContainerStarted","Data":"9245a343296c20834ae9ea518b2c5c9c4e0000edc2fb31ce5135f0306e15aa44"} Dec 04 15:32:43 crc kubenswrapper[4676]: I1204 15:32:43.752431 4676 generic.go:334] "Generic (PLEG): container finished" podID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerID="e1882ea34c65a52be22989b0f093a7704aa63df258982e000199cd8286f8ef7f" exitCode=0 Dec 04 15:32:43 crc kubenswrapper[4676]: I1204 15:32:43.753031 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" event={"ID":"d43defed-8b48-4daa-83b5-3b44b845c0d8","Type":"ContainerDied","Data":"e1882ea34c65a52be22989b0f093a7704aa63df258982e000199cd8286f8ef7f"} Dec 04 15:32:44 crc kubenswrapper[4676]: I1204 15:32:44.238023 4676 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 04 15:32:44 crc kubenswrapper[4676]: I1204 15:32:44.762326 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerID="9a341eb53ad34cd693ad9a81445c838b97cfb5bf1a489a9b0ff4c2a15c2c77a3" exitCode=0 Dec 04 15:32:44 crc kubenswrapper[4676]: I1204 15:32:44.762386 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" event={"ID":"d43defed-8b48-4daa-83b5-3b44b845c0d8","Type":"ContainerDied","Data":"9a341eb53ad34cd693ad9a81445c838b97cfb5bf1a489a9b0ff4c2a15c2c77a3"} Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.020963 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.026822 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.026947 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.027006 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.027875 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.028026 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9" gracePeriod=600 Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.148078 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle\") pod \"d43defed-8b48-4daa-83b5-3b44b845c0d8\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.148304 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util\") pod \"d43defed-8b48-4daa-83b5-3b44b845c0d8\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.148329 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn5kc\" (UniqueName: \"kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc\") pod \"d43defed-8b48-4daa-83b5-3b44b845c0d8\" (UID: \"d43defed-8b48-4daa-83b5-3b44b845c0d8\") " 
Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.150123 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle" (OuterVolumeSpecName: "bundle") pod "d43defed-8b48-4daa-83b5-3b44b845c0d8" (UID: "d43defed-8b48-4daa-83b5-3b44b845c0d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.154400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc" (OuterVolumeSpecName: "kube-api-access-pn5kc") pod "d43defed-8b48-4daa-83b5-3b44b845c0d8" (UID: "d43defed-8b48-4daa-83b5-3b44b845c0d8"). InnerVolumeSpecName "kube-api-access-pn5kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.159841 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util" (OuterVolumeSpecName: "util") pod "d43defed-8b48-4daa-83b5-3b44b845c0d8" (UID: "d43defed-8b48-4daa-83b5-3b44b845c0d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.226067 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:32:46 crc kubenswrapper[4676]: E1204 15:32:46.226639 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="util" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.226680 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="util" Dec 04 15:32:46 crc kubenswrapper[4676]: E1204 15:32:46.226703 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="pull" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.226711 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="pull" Dec 04 15:32:46 crc kubenswrapper[4676]: E1204 15:32:46.226730 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="extract" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.226741 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="extract" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.226886 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d43defed-8b48-4daa-83b5-3b44b845c0d8" containerName="extract" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.227950 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.237779 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.249795 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn5kc\" (UniqueName: \"kubernetes.io/projected/d43defed-8b48-4daa-83b5-3b44b845c0d8-kube-api-access-pn5kc\") on node \"crc\" DevicePath \"\"" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.249846 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-util\") on node \"crc\" DevicePath \"\"" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.249886 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d43defed-8b48-4daa-83b5-3b44b845c0d8-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.351213 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgv67\" (UniqueName: \"kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.351289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.351620 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.453176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.453295 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgv67\" (UniqueName: \"kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.453703 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.453764 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.454108 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.474731 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgv67\" (UniqueName: \"kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67\") pod \"redhat-operators-888bm\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.585586 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.779369 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9" exitCode=0 Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.779636 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9"} Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.779664 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195"} Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.779701 4676 scope.go:117] "RemoveContainer" containerID="bf1d12a652493590b1041f80cc7bc50696338309137f793248c1e4079ace37ed" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.788919 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" event={"ID":"d43defed-8b48-4daa-83b5-3b44b845c0d8","Type":"ContainerDied","Data":"9245a343296c20834ae9ea518b2c5c9c4e0000edc2fb31ce5135f0306e15aa44"} Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.788964 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9245a343296c20834ae9ea518b2c5c9c4e0000edc2fb31ce5135f0306e15aa44" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.789057 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6" Dec 04 15:32:46 crc kubenswrapper[4676]: I1204 15:32:46.823266 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:32:46 crc kubenswrapper[4676]: W1204 15:32:46.831560 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2712b2e1_7313_42f0_8e10_db5e0267a616.slice/crio-f0d49b5cf6b1c389c19c923222990d83f6718f08bac7efdaa9837a0dfd8fc4f9 WatchSource:0}: Error finding container f0d49b5cf6b1c389c19c923222990d83f6718f08bac7efdaa9837a0dfd8fc4f9: Status 404 returned error can't find the container with id f0d49b5cf6b1c389c19c923222990d83f6718f08bac7efdaa9837a0dfd8fc4f9 Dec 04 15:32:47 crc kubenswrapper[4676]: I1204 15:32:47.797242 4676 generic.go:334] "Generic (PLEG): container finished" podID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerID="4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b" exitCode=0 Dec 04 15:32:47 crc kubenswrapper[4676]: I1204 15:32:47.797375 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerDied","Data":"4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b"} Dec 04 15:32:47 crc kubenswrapper[4676]: I1204 15:32:47.797738 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerStarted","Data":"f0d49b5cf6b1c389c19c923222990d83f6718f08bac7efdaa9837a0dfd8fc4f9"} Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.740745 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf"] Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.741736 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.748266 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-vmx7q" Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.748265 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.755967 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.765081 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf"] Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.810460 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerStarted","Data":"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c"} Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.883656 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lbzx\" (UniqueName: \"kubernetes.io/projected/d88f4c5a-fc64-4912-a4c4-7a2af156aa3f-kube-api-access-5lbzx\") pod \"nmstate-operator-5b5b58f5c8-w7zxf\" (UID: \"d88f4c5a-fc64-4912-a4c4-7a2af156aa3f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" Dec 04 15:32:48 crc kubenswrapper[4676]: I1204 15:32:48.985336 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lbzx\" (UniqueName: \"kubernetes.io/projected/d88f4c5a-fc64-4912-a4c4-7a2af156aa3f-kube-api-access-5lbzx\") pod \"nmstate-operator-5b5b58f5c8-w7zxf\" (UID: \"d88f4c5a-fc64-4912-a4c4-7a2af156aa3f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" Dec 04 15:32:49 crc kubenswrapper[4676]: I1204 15:32:49.011244 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lbzx\" (UniqueName: \"kubernetes.io/projected/d88f4c5a-fc64-4912-a4c4-7a2af156aa3f-kube-api-access-5lbzx\") pod \"nmstate-operator-5b5b58f5c8-w7zxf\" (UID: \"d88f4c5a-fc64-4912-a4c4-7a2af156aa3f\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" Dec 04 15:32:49 crc kubenswrapper[4676]: I1204 15:32:49.064011 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf"
Dec 04 15:32:49 crc kubenswrapper[4676]: I1204 15:32:49.360965 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf"]
Dec 04 15:32:49 crc kubenswrapper[4676]: I1204 15:32:49.817518 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" event={"ID":"d88f4c5a-fc64-4912-a4c4-7a2af156aa3f","Type":"ContainerStarted","Data":"bb0dc86d15d32d8b506cf8cd856a6ca7d170f38e0b8e0dfaaf36bea43c29fba3"}
Dec 04 15:32:50 crc kubenswrapper[4676]: I1204 15:32:50.826085 4676 generic.go:334] "Generic (PLEG): container finished" podID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerID="970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c" exitCode=0
Dec 04 15:32:50 crc kubenswrapper[4676]: I1204 15:32:50.826129 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerDied","Data":"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c"}
Dec 04 15:32:51 crc kubenswrapper[4676]: I1204 15:32:51.836241 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerStarted","Data":"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782"}
Dec 04 15:32:51 crc kubenswrapper[4676]: I1204 15:32:51.861861 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-888bm" podStartSLOduration=2.458956927 podStartE2EDuration="5.861807739s" podCreationTimestamp="2025-12-04 15:32:46 +0000 UTC" firstStartedPulling="2025-12-04 15:32:47.799420089 +0000 UTC m=+775.234089946" lastFinishedPulling="2025-12-04 15:32:51.202270901 +0000 UTC m=+778.636940758" observedRunningTime="2025-12-04 15:32:51.85663785 +0000 UTC m=+779.291307717" watchObservedRunningTime="2025-12-04 15:32:51.861807739 +0000 UTC m=+779.296477596"
Dec 04 15:32:53 crc kubenswrapper[4676]: I1204 15:32:53.864591 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" event={"ID":"d88f4c5a-fc64-4912-a4c4-7a2af156aa3f","Type":"ContainerStarted","Data":"c4a727339343b7776287da2a8433e64d05cb7f5242698d29e980cc1bbaea676b"}
Dec 04 15:32:53 crc kubenswrapper[4676]: I1204 15:32:53.884123 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-w7zxf" podStartSLOduration=1.587778094 podStartE2EDuration="5.884097884s" podCreationTimestamp="2025-12-04 15:32:48 +0000 UTC" firstStartedPulling="2025-12-04 15:32:49.378553464 +0000 UTC m=+776.813223321" lastFinishedPulling="2025-12-04 15:32:53.674873254 +0000 UTC m=+781.109543111" observedRunningTime="2025-12-04 15:32:53.881446177 +0000 UTC m=+781.316116034" watchObservedRunningTime="2025-12-04 15:32:53.884097884 +0000 UTC m=+781.318767741"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.847884 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz"]
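The two "Observed pod startup duration" entries above fit a consistent arithmetic: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small Go check against the nmstate-operator entry, with timestamps copied from the log (reading the tracker's formula off the numbers is my inference; the layout string is Go's default time formatting):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Layout matching Go's default time.String() output, as logged above.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
    	t, err := time.Parse(layout, s)
    	if err != nil {
    		panic(err)
    	}
    	return t
    }

    func main() {
    	// Values from the nmstate-operator-5b5b58f5c8-w7zxf entry.
    	created := mustParse("2025-12-04 15:32:48 +0000 UTC")
    	firstPull := mustParse("2025-12-04 15:32:49.378553464 +0000 UTC")
    	lastPull := mustParse("2025-12-04 15:32:53.674873254 +0000 UTC")
    	running := mustParse("2025-12-04 15:32:53.884097884 +0000 UTC")

    	e2e := running.Sub(created)          // podStartE2EDuration
    	slo := e2e - lastPull.Sub(firstPull) // image-pull time excluded

    	fmt.Println(e2e) // 5.884097884s, matching podStartE2EDuration
    	fmt.Println(slo) // 1.587778094s, matching podStartSLOduration
    }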
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.849676 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.852477 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-kpbsm"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.860878 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz"]
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.873759 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg"]
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.874739 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.877051 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.893728 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-s57t5"]
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.894882 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-s57t5"
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.902525 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg"]
Dec 04 15:32:54 crc kubenswrapper[4676]: I1204 15:32:54.956284 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9jn9\" (UniqueName: \"kubernetes.io/projected/3a01cabf-b256-487e-840b-db8b85c3de85-kube-api-access-m9jn9\") pod \"nmstate-metrics-7f946cbc9-5tgzz\" (UID: \"3a01cabf-b256-487e-840b-db8b85c3de85\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.009716 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"]
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.010510 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.014312 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.014473 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.015252 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qcgcg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.023101 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"] Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-ovs-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058344 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r5n\" (UniqueName: \"kubernetes.io/projected/8cbb02ff-f891-4887-b834-ba6f1cf7274c-kube-api-access-x7r5n\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058373 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8cbb02ff-f891-4887-b834-ba6f1cf7274c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058401 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-dbus-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058438 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-nmstate-lock\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058468 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d565w\" (UniqueName: \"kubernetes.io/projected/fb8265ae-de57-4ac5-9804-d3becd3a48d5-kube-api-access-d565w\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.058516 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9jn9\" (UniqueName: \"kubernetes.io/projected/3a01cabf-b256-487e-840b-db8b85c3de85-kube-api-access-m9jn9\") pod 
\"nmstate-metrics-7f946cbc9-5tgzz\" (UID: \"3a01cabf-b256-487e-840b-db8b85c3de85\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.083485 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9jn9\" (UniqueName: \"kubernetes.io/projected/3a01cabf-b256-487e-840b-db8b85c3de85-kube-api-access-m9jn9\") pod \"nmstate-metrics-7f946cbc9-5tgzz\" (UID: \"3a01cabf-b256-487e-840b-db8b85c3de85\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160180 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-nmstate-lock\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160255 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d565w\" (UniqueName: \"kubernetes.io/projected/fb8265ae-de57-4ac5-9804-d3becd3a48d5-kube-api-access-d565w\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160293 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c4a94816-54e1-4cde-87cd-130411826243-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160333 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-nmstate-lock\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160366 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsfz\" (UniqueName: \"kubernetes.io/projected/c4a94816-54e1-4cde-87cd-130411826243-kube-api-access-mhsfz\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-ovs-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160484 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r5n\" (UniqueName: \"kubernetes.io/projected/8cbb02ff-f891-4887-b834-ba6f1cf7274c-kube-api-access-x7r5n\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160518 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a94816-54e1-4cde-87cd-130411826243-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160547 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8cbb02ff-f891-4887-b834-ba6f1cf7274c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160572 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-dbus-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160917 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-dbus-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.160915 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fb8265ae-de57-4ac5-9804-d3becd3a48d5-ovs-socket\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.166299 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/8cbb02ff-f891-4887-b834-ba6f1cf7274c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.168233 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.180707 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r5n\" (UniqueName: \"kubernetes.io/projected/8cbb02ff-f891-4887-b834-ba6f1cf7274c-kube-api-access-x7r5n\") pod \"nmstate-webhook-5f6d4c5ccb-8sbbg\" (UID: \"8cbb02ff-f891-4887-b834-ba6f1cf7274c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.188528 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d565w\" (UniqueName: \"kubernetes.io/projected/fb8265ae-de57-4ac5-9804-d3becd3a48d5-kube-api-access-d565w\") pod \"nmstate-handler-s57t5\" (UID: \"fb8265ae-de57-4ac5-9804-d3becd3a48d5\") " pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.190767 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.211647 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-s57t5"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.261628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a94816-54e1-4cde-87cd-130411826243-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.261935 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c4a94816-54e1-4cde-87cd-130411826243-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.261996 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsfz\" (UniqueName: \"kubernetes.io/projected/c4a94816-54e1-4cde-87cd-130411826243-kube-api-access-mhsfz\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.263497 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c4a94816-54e1-4cde-87cd-130411826243-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.267030 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c4a94816-54e1-4cde-87cd-130411826243-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.282747 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsfz\" (UniqueName: \"kubernetes.io/projected/c4a94816-54e1-4cde-87cd-130411826243-kube-api-access-mhsfz\") pod \"nmstate-console-plugin-7fbb5f6569-cpjcs\" (UID: \"c4a94816-54e1-4cde-87cd-130411826243\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.326512 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.878457 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s57t5" event={"ID":"fb8265ae-de57-4ac5-9804-d3becd3a48d5","Type":"ContainerStarted","Data":"2df5cc8e12d44f52beae23dec37582b7b966548df0ad700a93bc218fc8692143"}
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.929358 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fdc6c546f-752rv"]
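The repeating VerifyControllerAttachedVolume, MountVolume started, and MountVolume.SetUp succeeded triples above come from the kubelet volume manager reconciling actual mounts toward the pod's desired set. A schematic Go sketch of that desired-versus-actual loop (names and structure invented for illustration; the real reconciler in kubelet's volumemanager package is far more involved):

    package main

    import "fmt"

    // reconcile mounts every volume that is desired but not yet actual,
    // mirroring the "MountVolume started" / "SetUp succeeded" pairs above.
    func reconcile(desired []string, actual map[string]bool) {
    	for _, vol := range desired {
    		if actual[vol] {
    			continue // already mounted; nothing to do this pass
    		}
    		fmt.Printf("MountVolume started for volume %q\n", vol)
    		// ... plugin-specific SetUp (configmap, secret, projected) would run here ...
    		actual[vol] = true
    		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
    	}
    }

    func main() {
    	// Volume names taken from the nmstate-console-plugin pod above.
    	desired := []string{"nginx-conf", "plugin-serving-cert", "kube-api-access-mhsfz"}
    	reconcile(desired, map[string]bool{})
    }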
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.934967 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.942882 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fdc6c546f-752rv"]
Dec 04 15:32:55 crc kubenswrapper[4676]: I1204 15:32:55.964470 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs"]
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.041297 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg"]
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073554 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073607 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-service-ca\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073644 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blfkq\" (UniqueName: \"kubernetes.io/projected/82e741bd-45e1-4705-bc6e-48b18bd1b97c-kube-api-access-blfkq\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073673 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073805 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-oauth-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073957 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-trusted-ca-bundle\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.073992 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-oauth-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv"
Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 
15:32:56.094952 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz"] Dec 04 15:32:56 crc kubenswrapper[4676]: W1204 15:32:56.102636 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a01cabf_b256_487e_840b_db8b85c3de85.slice/crio-6d91ce1a4654fad33848301b2f3ed3bb8d5eabc84febcbe0a9dfbca352f17256 WatchSource:0}: Error finding container 6d91ce1a4654fad33848301b2f3ed3bb8d5eabc84febcbe0a9dfbca352f17256: Status 404 returned error can't find the container with id 6d91ce1a4654fad33848301b2f3ed3bb8d5eabc84febcbe0a9dfbca352f17256 Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.175344 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-oauth-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.175621 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-trusted-ca-bundle\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.175736 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-oauth-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.175939 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.176031 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-service-ca\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.176170 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blfkq\" (UniqueName: \"kubernetes.io/projected/82e741bd-45e1-4705-bc6e-48b18bd1b97c-kube-api-access-blfkq\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.176282 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.176867 4676 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-service-ca\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.177081 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.177212 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-trusted-ca-bundle\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.177379 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82e741bd-45e1-4705-bc6e-48b18bd1b97c-oauth-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.181342 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-serving-cert\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.182956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82e741bd-45e1-4705-bc6e-48b18bd1b97c-console-oauth-config\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.195707 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blfkq\" (UniqueName: \"kubernetes.io/projected/82e741bd-45e1-4705-bc6e-48b18bd1b97c-kube-api-access-blfkq\") pod \"console-7fdc6c546f-752rv\" (UID: \"82e741bd-45e1-4705-bc6e-48b18bd1b97c\") " pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.254630 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.589492 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.589614 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.674483 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fdc6c546f-752rv"] Dec 04 15:32:56 crc kubenswrapper[4676]: W1204 15:32:56.683648 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e741bd_45e1_4705_bc6e_48b18bd1b97c.slice/crio-d6021083384f236e787f260a1d556d45138c8adf3e773a80d002b5e26f7c6808 WatchSource:0}: Error finding container d6021083384f236e787f260a1d556d45138c8adf3e773a80d002b5e26f7c6808: Status 404 returned error can't find the container with id d6021083384f236e787f260a1d556d45138c8adf3e773a80d002b5e26f7c6808 Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.890978 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" event={"ID":"c4a94816-54e1-4cde-87cd-130411826243","Type":"ContainerStarted","Data":"98a2046f604cea6f0d9cbd24700b50b81fe6305db3f21e25f0f3340b8b632346"} Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.892847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" event={"ID":"8cbb02ff-f891-4887-b834-ba6f1cf7274c","Type":"ContainerStarted","Data":"a7e13582dd8165fadbea5a9a8e9bae65aac60f8634f8c934e913ed4c81a940d3"} Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.895313 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc6c546f-752rv" event={"ID":"82e741bd-45e1-4705-bc6e-48b18bd1b97c","Type":"ContainerStarted","Data":"6c4a89ba301fbb8c3aab3c656eb545462674ba96e22a6d5331d115ffac90047c"} Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.895368 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fdc6c546f-752rv" event={"ID":"82e741bd-45e1-4705-bc6e-48b18bd1b97c","Type":"ContainerStarted","Data":"d6021083384f236e787f260a1d556d45138c8adf3e773a80d002b5e26f7c6808"} Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.899390 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" event={"ID":"3a01cabf-b256-487e-840b-db8b85c3de85","Type":"ContainerStarted","Data":"6d91ce1a4654fad33848301b2f3ed3bb8d5eabc84febcbe0a9dfbca352f17256"} Dec 04 15:32:56 crc kubenswrapper[4676]: I1204 15:32:56.920477 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fdc6c546f-752rv" podStartSLOduration=1.920452895 podStartE2EDuration="1.920452895s" podCreationTimestamp="2025-12-04 15:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:32:56.919509327 +0000 UTC m=+784.354179194" watchObservedRunningTime="2025-12-04 15:32:56.920452895 +0000 UTC m=+784.355122752" Dec 04 15:32:57 crc kubenswrapper[4676]: I1204 15:32:57.804097 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-888bm" 
podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="registry-server" probeResult="failure" output=< Dec 04 15:32:57 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 15:32:57 crc kubenswrapper[4676]: > Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.010023 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" event={"ID":"3a01cabf-b256-487e-840b-db8b85c3de85","Type":"ContainerStarted","Data":"8a0c2e64ce04b34288a4735c2147ce74a350d04d8d028084536372aacce8d271"} Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.011410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-s57t5" event={"ID":"fb8265ae-de57-4ac5-9804-d3becd3a48d5","Type":"ContainerStarted","Data":"89c224567ce455ae799cdb35c1a1bd5442c93feb99fa4232686a4db250b0b7eb"} Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.011749 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.013386 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" event={"ID":"c4a94816-54e1-4cde-87cd-130411826243","Type":"ContainerStarted","Data":"88d8bd30a2d7c83d061dabf7ba007b0e26e3b0ead57e670bff39f9fb6426ed1a"} Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.015895 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" event={"ID":"8cbb02ff-f891-4887-b834-ba6f1cf7274c","Type":"ContainerStarted","Data":"8b066232fa578c74a2f18a4b4bb4b77d3c64898bebca3cf80fa715e3b5056671"} Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.016198 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.037705 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-s57t5" podStartSLOduration=2.166003771 podStartE2EDuration="10.03768222s" podCreationTimestamp="2025-12-04 15:32:54 +0000 UTC" firstStartedPulling="2025-12-04 15:32:55.237538861 +0000 UTC m=+782.672208728" lastFinishedPulling="2025-12-04 15:33:03.10921732 +0000 UTC m=+790.543887177" observedRunningTime="2025-12-04 15:33:04.030659008 +0000 UTC m=+791.465328885" watchObservedRunningTime="2025-12-04 15:33:04.03768222 +0000 UTC m=+791.472352077" Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.053682 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-cpjcs" podStartSLOduration=2.922019809 podStartE2EDuration="10.05365243s" podCreationTimestamp="2025-12-04 15:32:54 +0000 UTC" firstStartedPulling="2025-12-04 15:32:55.984054076 +0000 UTC m=+783.418723933" lastFinishedPulling="2025-12-04 15:33:03.115686697 +0000 UTC m=+790.550356554" observedRunningTime="2025-12-04 15:33:04.047637227 +0000 UTC m=+791.482307084" watchObservedRunningTime="2025-12-04 15:33:04.05365243 +0000 UTC m=+791.488322287" Dec 04 15:33:04 crc kubenswrapper[4676]: I1204 15:33:04.069761 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" podStartSLOduration=2.981626277 podStartE2EDuration="10.069741734s" podCreationTimestamp="2025-12-04 15:32:54 +0000 UTC" firstStartedPulling="2025-12-04 15:32:56.050632515 +0000 UTC 
m=+783.485302372" lastFinishedPulling="2025-12-04 15:33:03.138747962 +0000 UTC m=+790.573417829" observedRunningTime="2025-12-04 15:33:04.065487872 +0000 UTC m=+791.500157739" watchObservedRunningTime="2025-12-04 15:33:04.069741734 +0000 UTC m=+791.504411591" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.255824 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.256405 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.388960 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.627855 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.679030 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:33:06 crc kubenswrapper[4676]: I1204 15:33:06.865896 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:33:07 crc kubenswrapper[4676]: I1204 15:33:07.050653 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" event={"ID":"3a01cabf-b256-487e-840b-db8b85c3de85","Type":"ContainerStarted","Data":"cf2965b4ebe7467a77bc9673205854291a55e4e621901194bb00e54745005118"} Dec 04 15:33:07 crc kubenswrapper[4676]: I1204 15:33:07.056216 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fdc6c546f-752rv" Dec 04 15:33:07 crc kubenswrapper[4676]: I1204 15:33:07.074282 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-5tgzz" podStartSLOduration=2.541324848 podStartE2EDuration="13.074238197s" podCreationTimestamp="2025-12-04 15:32:54 +0000 UTC" firstStartedPulling="2025-12-04 15:32:56.106277169 +0000 UTC m=+783.540947026" lastFinishedPulling="2025-12-04 15:33:06.639190518 +0000 UTC m=+794.073860375" observedRunningTime="2025-12-04 15:33:07.070812598 +0000 UTC m=+794.505482475" watchObservedRunningTime="2025-12-04 15:33:07.074238197 +0000 UTC m=+794.508908054" Dec 04 15:33:07 crc kubenswrapper[4676]: I1204 15:33:07.239339 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.057343 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-888bm" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="registry-server" containerID="cri-o://cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782" gracePeriod=2 Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.496817 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.591896 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgv67\" (UniqueName: \"kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67\") pod \"2712b2e1-7313-42f0-8e10-db5e0267a616\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.592109 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content\") pod \"2712b2e1-7313-42f0-8e10-db5e0267a616\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.592142 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities\") pod \"2712b2e1-7313-42f0-8e10-db5e0267a616\" (UID: \"2712b2e1-7313-42f0-8e10-db5e0267a616\") " Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.593111 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities" (OuterVolumeSpecName: "utilities") pod "2712b2e1-7313-42f0-8e10-db5e0267a616" (UID: "2712b2e1-7313-42f0-8e10-db5e0267a616"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.597553 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67" (OuterVolumeSpecName: "kube-api-access-pgv67") pod "2712b2e1-7313-42f0-8e10-db5e0267a616" (UID: "2712b2e1-7313-42f0-8e10-db5e0267a616"). InnerVolumeSpecName "kube-api-access-pgv67". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.693621 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.693670 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgv67\" (UniqueName: \"kubernetes.io/projected/2712b2e1-7313-42f0-8e10-db5e0267a616-kube-api-access-pgv67\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.715433 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2712b2e1-7313-42f0-8e10-db5e0267a616" (UID: "2712b2e1-7313-42f0-8e10-db5e0267a616"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:33:08 crc kubenswrapper[4676]: I1204 15:33:08.795093 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2712b2e1-7313-42f0-8e10-db5e0267a616-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.067246 4676 generic.go:334] "Generic (PLEG): container finished" podID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerID="cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782" exitCode=0 Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.067339 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-888bm" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.067339 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerDied","Data":"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782"} Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.067405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-888bm" event={"ID":"2712b2e1-7313-42f0-8e10-db5e0267a616","Type":"ContainerDied","Data":"f0d49b5cf6b1c389c19c923222990d83f6718f08bac7efdaa9837a0dfd8fc4f9"} Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.067447 4676 scope.go:117] "RemoveContainer" containerID="cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.084176 4676 scope.go:117] "RemoveContainer" containerID="970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.102876 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.104234 4676 scope.go:117] "RemoveContainer" containerID="4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.106380 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-888bm"] Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.122788 4676 scope.go:117] "RemoveContainer" containerID="cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782" Dec 04 15:33:09 crc kubenswrapper[4676]: E1204 15:33:09.123421 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782\": container with ID starting with cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782 not found: ID does not exist" containerID="cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.123473 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782"} err="failed to get container status \"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782\": rpc error: code = NotFound desc = could not find container \"cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782\": container with ID starting with cc1a953069858ee630555e7ac81a8096b74821e937d57f63ac9c0baca525b782 not found: ID does not exist" Dec 04 15:33:09 crc 
kubenswrapper[4676]: I1204 15:33:09.123502 4676 scope.go:117] "RemoveContainer" containerID="970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c" Dec 04 15:33:09 crc kubenswrapper[4676]: E1204 15:33:09.123963 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c\": container with ID starting with 970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c not found: ID does not exist" containerID="970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.124017 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c"} err="failed to get container status \"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c\": rpc error: code = NotFound desc = could not find container \"970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c\": container with ID starting with 970abf4255415c417cc51acbe45f9cc7df023c09eaf6991f791a9ee740aaa04c not found: ID does not exist" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.124050 4676 scope.go:117] "RemoveContainer" containerID="4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b" Dec 04 15:33:09 crc kubenswrapper[4676]: E1204 15:33:09.124436 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b\": container with ID starting with 4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b not found: ID does not exist" containerID="4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.124480 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b"} err="failed to get container status \"4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b\": rpc error: code = NotFound desc = could not find container \"4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b\": container with ID starting with 4315bcc125017d4d9e3e72037123a95c95c4630ac84ad4d4ef933f4a5acdb52b not found: ID does not exist" Dec 04 15:33:09 crc kubenswrapper[4676]: I1204 15:33:09.393655 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" path="/var/lib/kubelet/pods/2712b2e1-7313-42f0-8e10-db5e0267a616/volumes" Dec 04 15:33:10 crc kubenswrapper[4676]: I1204 15:33:10.233686 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-s57t5" Dec 04 15:33:15 crc kubenswrapper[4676]: I1204 15:33:15.200638 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8sbbg" Dec 04 15:33:28 crc kubenswrapper[4676]: I1204 15:33:28.998389 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4"] Dec 04 15:33:29 crc kubenswrapper[4676]: E1204 15:33:28.999314 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="extract-utilities" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 
15:33:28.999345 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="extract-utilities" Dec 04 15:33:29 crc kubenswrapper[4676]: E1204 15:33:28.999374 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="registry-server" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:28.999383 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="registry-server" Dec 04 15:33:29 crc kubenswrapper[4676]: E1204 15:33:28.999394 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="extract-content" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:28.999403 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="extract-content" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:28.999574 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2712b2e1-7313-42f0-8e10-db5e0267a616" containerName="registry-server" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.001052 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.003674 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.007693 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4"] Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.086363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.086744 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pw8s\" (UniqueName: \"kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.086794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.188128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: 
\"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.188178 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pw8s\" (UniqueName: \"kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.188250 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.188749 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.188963 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.210833 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pw8s\" (UniqueName: \"kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.318415 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:29 crc kubenswrapper[4676]: I1204 15:33:29.746798 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4"] Dec 04 15:33:30 crc kubenswrapper[4676]: I1204 15:33:30.323117 4676 generic.go:334] "Generic (PLEG): container finished" podID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerID="2604406b6834e08f2730c630d178aa5ef3836756b846b1a0bf1c5b4eeb2e8518" exitCode=0 Dec 04 15:33:30 crc kubenswrapper[4676]: I1204 15:33:30.323179 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" event={"ID":"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6","Type":"ContainerDied","Data":"2604406b6834e08f2730c630d178aa5ef3836756b846b1a0bf1c5b4eeb2e8518"} Dec 04 15:33:30 crc kubenswrapper[4676]: I1204 15:33:30.323213 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" event={"ID":"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6","Type":"ContainerStarted","Data":"d62089dffc5fb8d675e3aa944b29effe2cef9d8f1bf223f9c778bf6614f21c8b"} Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.286452 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-mtj84" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" containerID="cri-o://a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715" gracePeriod=15 Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.736653 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mtj84_0bf416c7-7121-4ca9-8a52-9cbb0d4dc362/console/0.log" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.737078 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.914673 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.914851 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.914889 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87t45\" (UniqueName: \"kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.914949 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.914971 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.915003 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.915025 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle\") pod \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\" (UID: \"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362\") " Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.916031 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca" (OuterVolumeSpecName: "service-ca") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.916041 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.916041 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config" (OuterVolumeSpecName: "console-config") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.916102 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.921706 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.921849 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45" (OuterVolumeSpecName: "kube-api-access-87t45") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "kube-api-access-87t45". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:33:32 crc kubenswrapper[4676]: I1204 15:33:32.923626 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" (UID: "0bf416c7-7121-4ca9-8a52-9cbb0d4dc362"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016438 4676 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016495 4676 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-service-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016508 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87t45\" (UniqueName: \"kubernetes.io/projected/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-kube-api-access-87t45\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016520 4676 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016533 4676 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016543 4676 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-console-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.016553 4676 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341453 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-mtj84_0bf416c7-7121-4ca9-8a52-9cbb0d4dc362/console/0.log" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341507 4676 generic.go:334] "Generic (PLEG): container finished" podID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerID="a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715" exitCode=2 Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341551 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mtj84" event={"ID":"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362","Type":"ContainerDied","Data":"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715"} Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-mtj84" event={"ID":"0bf416c7-7121-4ca9-8a52-9cbb0d4dc362","Type":"ContainerDied","Data":"bf8fecebf4d575dfd03e57f6a6aa298c07db2809d2bd4ae54a626d6bae980cd7"} Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341590 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-mtj84" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.341607 4676 scope.go:117] "RemoveContainer" containerID="a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.373890 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.504466 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-mtj84"] Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.554417 4676 scope.go:117] "RemoveContainer" containerID="a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715" Dec 04 15:33:33 crc kubenswrapper[4676]: E1204 15:33:33.554968 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715\": container with ID starting with a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715 not found: ID does not exist" containerID="a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715" Dec 04 15:33:33 crc kubenswrapper[4676]: I1204 15:33:33.555098 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715"} err="failed to get container status \"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715\": rpc error: code = NotFound desc = could not find container \"a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715\": container with ID starting with a8ca78924a623958a8d324fba96a5ef251f327c7f0198d0d856eb47318dac715 not found: ID does not exist" Dec 04 15:33:35 crc kubenswrapper[4676]: I1204 15:33:35.401984 4676 generic.go:334] "Generic (PLEG): container finished" podID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerID="b4486725c11f6d51dba145e399ac5d6b051be0261bdce47f0f5faf28c6f34731" exitCode=0 Dec 04 15:33:35 crc kubenswrapper[4676]: I1204 15:33:35.413376 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" path="/var/lib/kubelet/pods/0bf416c7-7121-4ca9-8a52-9cbb0d4dc362/volumes" Dec 04 15:33:35 crc kubenswrapper[4676]: I1204 15:33:35.413850 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" event={"ID":"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6","Type":"ContainerDied","Data":"b4486725c11f6d51dba145e399ac5d6b051be0261bdce47f0f5faf28c6f34731"} Dec 04 15:33:36 crc kubenswrapper[4676]: I1204 15:33:36.411752 4676 generic.go:334] "Generic (PLEG): container finished" podID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerID="60745b6f596b3f4965f7dcdc1f8fe755676e7d2736781b8f46b73c4de073b1ac" exitCode=0 Dec 04 15:33:36 crc kubenswrapper[4676]: I1204 15:33:36.411876 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" event={"ID":"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6","Type":"ContainerDied","Data":"60745b6f596b3f4965f7dcdc1f8fe755676e7d2736781b8f46b73c4de073b1ac"} Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.663648 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.787894 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pw8s\" (UniqueName: \"kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s\") pod \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.787993 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util\") pod \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.788090 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle\") pod \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\" (UID: \"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6\") " Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.789163 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle" (OuterVolumeSpecName: "bundle") pod "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" (UID: "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.797171 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s" (OuterVolumeSpecName: "kube-api-access-7pw8s") pod "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" (UID: "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6"). InnerVolumeSpecName "kube-api-access-7pw8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.803788 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util" (OuterVolumeSpecName: "util") pod "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" (UID: "c1aa4cb1-4632-4d55-a604-7a1a853ba9c6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.889447 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pw8s\" (UniqueName: \"kubernetes.io/projected/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-kube-api-access-7pw8s\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.889499 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-util\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:37 crc kubenswrapper[4676]: I1204 15:33:37.889512 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1aa4cb1-4632-4d55-a604-7a1a853ba9c6-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:33:38 crc kubenswrapper[4676]: I1204 15:33:38.426841 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" event={"ID":"c1aa4cb1-4632-4d55-a604-7a1a853ba9c6","Type":"ContainerDied","Data":"d62089dffc5fb8d675e3aa944b29effe2cef9d8f1bf223f9c778bf6614f21c8b"} Dec 04 15:33:38 crc kubenswrapper[4676]: I1204 15:33:38.426894 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d62089dffc5fb8d675e3aa944b29effe2cef9d8f1bf223f9c778bf6614f21c8b" Dec 04 15:33:38 crc kubenswrapper[4676]: I1204 15:33:38.426954 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.525361 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf"] Dec 04 15:33:47 crc kubenswrapper[4676]: E1204 15:33:47.525923 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="pull" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.525937 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="pull" Dec 04 15:33:47 crc kubenswrapper[4676]: E1204 15:33:47.525953 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="util" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.525959 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="util" Dec 04 15:33:47 crc kubenswrapper[4676]: E1204 15:33:47.525966 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="extract" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.525972 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="extract" Dec 04 15:33:47 crc kubenswrapper[4676]: E1204 15:33:47.525980 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.525986 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.526103 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1aa4cb1-4632-4d55-a604-7a1a853ba9c6" containerName="extract" Dec 
04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.526113 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf416c7-7121-4ca9-8a52-9cbb0d4dc362" containerName="console" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.526542 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.528293 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.528494 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.528821 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.529069 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-m6kwk" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.529764 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.541252 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf"] Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.674173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-webhook-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.674226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-apiservice-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.674445 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnblv\" (UniqueName: \"kubernetes.io/projected/e2b1da94-9d99-4645-af39-b9429c50896e-kube-api-access-qnblv\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.775687 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-webhook-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.775758 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-apiservice-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.775812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnblv\" (UniqueName: \"kubernetes.io/projected/e2b1da94-9d99-4645-af39-b9429c50896e-kube-api-access-qnblv\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.783671 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-apiservice-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.783681 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e2b1da94-9d99-4645-af39-b9429c50896e-webhook-cert\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.820672 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnblv\" (UniqueName: \"kubernetes.io/projected/e2b1da94-9d99-4645-af39-b9429c50896e-kube-api-access-qnblv\") pod \"metallb-operator-controller-manager-6d9899ddf8-t2gzf\" (UID: \"e2b1da94-9d99-4645-af39-b9429c50896e\") " pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.843474 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.962047 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2"] Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.963108 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.965484 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-zpm8s" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.965688 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.970171 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 15:33:47 crc kubenswrapper[4676]: I1204 15:33:47.986142 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2"] Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.079757 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq57t\" (UniqueName: \"kubernetes.io/projected/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-kube-api-access-qq57t\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.080113 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-apiservice-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.080148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-webhook-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.181320 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq57t\" (UniqueName: \"kubernetes.io/projected/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-kube-api-access-qq57t\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.181409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-apiservice-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.181450 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-webhook-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 
15:33:48.204932 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-apiservice-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.205469 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-webhook-cert\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.211840 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq57t\" (UniqueName: \"kubernetes.io/projected/c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443-kube-api-access-qq57t\") pod \"metallb-operator-webhook-server-55ff4bc57f-ctsr2\" (UID: \"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443\") " pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.249698 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf"] Dec 04 15:33:48 crc kubenswrapper[4676]: W1204 15:33:48.256524 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b1da94_9d99_4645_af39_b9429c50896e.slice/crio-81cccac2f5d39f629d62b6aaa4908e8783e6e0db69dc31d9a7d8d6257a526665 WatchSource:0}: Error finding container 81cccac2f5d39f629d62b6aaa4908e8783e6e0db69dc31d9a7d8d6257a526665: Status 404 returned error can't find the container with id 81cccac2f5d39f629d62b6aaa4908e8783e6e0db69dc31d9a7d8d6257a526665 Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.293069 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.546245 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2"] Dec 04 15:33:48 crc kubenswrapper[4676]: W1204 15:33:48.559401 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5df83ac_ab2b_4fbb_8f48_f8e2c7eca443.slice/crio-3d6bff39e0d76f19317edfc09f4b0a6b9328211fa46239377f131be99664e297 WatchSource:0}: Error finding container 3d6bff39e0d76f19317edfc09f4b0a6b9328211fa46239377f131be99664e297: Status 404 returned error can't find the container with id 3d6bff39e0d76f19317edfc09f4b0a6b9328211fa46239377f131be99664e297 Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.678037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" event={"ID":"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443","Type":"ContainerStarted","Data":"3d6bff39e0d76f19317edfc09f4b0a6b9328211fa46239377f131be99664e297"} Dec 04 15:33:48 crc kubenswrapper[4676]: I1204 15:33:48.679131 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" event={"ID":"e2b1da94-9d99-4645-af39-b9429c50896e","Type":"ContainerStarted","Data":"81cccac2f5d39f629d62b6aaa4908e8783e6e0db69dc31d9a7d8d6257a526665"} Dec 04 15:33:51 crc kubenswrapper[4676]: I1204 15:33:51.709693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" event={"ID":"e2b1da94-9d99-4645-af39-b9429c50896e","Type":"ContainerStarted","Data":"81da52098dc4b12d9f4b6e81e34dedf1e277cccd534c85b75a80d56f340d27c1"} Dec 04 15:33:52 crc kubenswrapper[4676]: I1204 15:33:52.717825 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:33:53 crc kubenswrapper[4676]: I1204 15:33:53.413601 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" podStartSLOduration=3.107110649 podStartE2EDuration="6.413490361s" podCreationTimestamp="2025-12-04 15:33:47 +0000 UTC" firstStartedPulling="2025-12-04 15:33:48.258643864 +0000 UTC m=+835.693313721" lastFinishedPulling="2025-12-04 15:33:51.565023576 +0000 UTC m=+838.999693433" observedRunningTime="2025-12-04 15:33:52.741241975 +0000 UTC m=+840.175911832" watchObservedRunningTime="2025-12-04 15:33:53.413490361 +0000 UTC m=+840.848160208" Dec 04 15:33:53 crc kubenswrapper[4676]: I1204 15:33:53.725259 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" event={"ID":"c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443","Type":"ContainerStarted","Data":"ef92b014efcbe372610dfc4254bfe662191ccf1fc6692547665c69beb1d3a5b3"} Dec 04 15:33:53 crc kubenswrapper[4676]: I1204 15:33:53.752215 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" podStartSLOduration=1.919212082 podStartE2EDuration="6.752192952s" podCreationTimestamp="2025-12-04 15:33:47 +0000 UTC" firstStartedPulling="2025-12-04 15:33:48.565052615 +0000 UTC m=+835.999722472" lastFinishedPulling="2025-12-04 15:33:53.398033485 +0000 UTC m=+840.832703342" 
observedRunningTime="2025-12-04 15:33:53.748961639 +0000 UTC m=+841.183631516" watchObservedRunningTime="2025-12-04 15:33:53.752192952 +0000 UTC m=+841.186862819" Dec 04 15:33:54 crc kubenswrapper[4676]: I1204 15:33:54.732208 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:34:08 crc kubenswrapper[4676]: I1204 15:34:08.299688 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-55ff4bc57f-ctsr2" Dec 04 15:34:27 crc kubenswrapper[4676]: I1204 15:34:27.846370 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d9899ddf8-t2gzf" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.782204 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r4r27"] Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.792463 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv"] Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.792627 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.795073 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.803042 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.803397 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.803490 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wmvxr" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.803529 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.811479 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv"] Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.874950 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pn92f"] Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.876150 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pn92f" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.880204 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.880468 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.880583 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-5f8vq" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.880686 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.902708 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-5p58x"] Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpltl\" (UniqueName: \"kubernetes.io/projected/e0d02430-19e7-4515-ac98-59549551ec90-kube-api-access-gpltl\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903590 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903635 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-sockets\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903697 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf74m\" (UniqueName: \"kubernetes.io/projected/652c71f4-1df3-45cb-9540-fac675f8134f-kube-api-access-sf74m\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-metrics\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903777 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-conf\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903810 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-reloader\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.903832 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e0d02430-19e7-4515-ac98-59549551ec90-frr-startup\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.905917 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 04 15:34:28 crc kubenswrapper[4676]: I1204 15:34:28.913793 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5p58x"] Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005549 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-metrics-certs\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005687 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-conf\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005765 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-reloader\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005789 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e0d02430-19e7-4515-ac98-59549551ec90-frr-startup\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005815 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist\") pod \"speaker-pn92f\" (UID: 
\"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.005969 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpltl\" (UniqueName: \"kubernetes.io/projected/e0d02430-19e7-4515-ac98-59549551ec90-kube-api-access-gpltl\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006041 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznrz\" (UniqueName: \"kubernetes.io/projected/a4165e19-a60f-458e-904c-9092df340dd0-kube-api-access-vznrz\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006100 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006133 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a4165e19-a60f-458e-904c-9092df340dd0-metallb-excludel2\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-sockets\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006286 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8llw\" (UniqueName: \"kubernetes.io/projected/4d025efd-41d1-4aa2-8bdf-348a4e378082-kube-api-access-p8llw\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006319 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-cert\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.006362 4676 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-reloader\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006424 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-conf\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.006447 4676 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.006501 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert podName:652c71f4-1df3-45cb-9540-fac675f8134f nodeName:}" failed. No retries permitted until 2025-12-04 15:34:29.506454066 +0000 UTC m=+876.941123923 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert") pod "frr-k8s-webhook-server-7fcb986d4-7g2tv" (UID: "652c71f4-1df3-45cb-9540-fac675f8134f") : secret "frr-k8s-webhook-server-cert" not found Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006370 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.006559 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs podName:e0d02430-19e7-4515-ac98-59549551ec90 nodeName:}" failed. No retries permitted until 2025-12-04 15:34:29.506531638 +0000 UTC m=+876.941201495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs") pod "frr-k8s-r4r27" (UID: "e0d02430-19e7-4515-ac98-59549551ec90") : secret "frr-k8s-certs-secret" not found Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006587 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-frr-sockets\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006600 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf74m\" (UniqueName: \"kubernetes.io/projected/652c71f4-1df3-45cb-9540-fac675f8134f-kube-api-access-sf74m\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-metrics\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/e0d02430-19e7-4515-ac98-59549551ec90-metrics\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.006998 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/e0d02430-19e7-4515-ac98-59549551ec90-frr-startup\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.130979 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznrz\" (UniqueName: \"kubernetes.io/projected/a4165e19-a60f-458e-904c-9092df340dd0-kube-api-access-vznrz\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131047 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a4165e19-a60f-458e-904c-9092df340dd0-metallb-excludel2\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131089 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8llw\" (UniqueName: \"kubernetes.io/projected/4d025efd-41d1-4aa2-8bdf-348a4e378082-kube-api-access-p8llw\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131127 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-cert\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-metrics-certs\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131209 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.131227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.131377 4676 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.131443 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist podName:a4165e19-a60f-458e-904c-9092df340dd0 nodeName:}" failed. No retries permitted until 2025-12-04 15:34:29.631422957 +0000 UTC m=+877.066092814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist") pod "speaker-pn92f" (UID: "a4165e19-a60f-458e-904c-9092df340dd0") : secret "metallb-memberlist" not found Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.132023 4676 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.132097 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs podName:4d025efd-41d1-4aa2-8bdf-348a4e378082 nodeName:}" failed. No retries permitted until 2025-12-04 15:34:29.632078586 +0000 UTC m=+877.066748443 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs") pod "controller-f8648f98b-5p58x" (UID: "4d025efd-41d1-4aa2-8bdf-348a4e378082") : secret "controller-certs-secret" not found Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.132524 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a4165e19-a60f-458e-904c-9092df340dd0-metallb-excludel2\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.137001 4676 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.138224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-metrics-certs\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.147460 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-cert\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.150923 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpltl\" (UniqueName: \"kubernetes.io/projected/e0d02430-19e7-4515-ac98-59549551ec90-kube-api-access-gpltl\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.159272 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznrz\" (UniqueName: \"kubernetes.io/projected/a4165e19-a60f-458e-904c-9092df340dd0-kube-api-access-vznrz\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.159420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf74m\" (UniqueName: \"kubernetes.io/projected/652c71f4-1df3-45cb-9540-fac675f8134f-kube-api-access-sf74m\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.160459 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8llw\" (UniqueName: \"kubernetes.io/projected/4d025efd-41d1-4aa2-8bdf-348a4e378082-kube-api-access-p8llw\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.536114 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.536198 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.540448 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/652c71f4-1df3-45cb-9540-fac675f8134f-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7g2tv\" (UID: \"652c71f4-1df3-45cb-9540-fac675f8134f\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.541340 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0d02430-19e7-4515-ac98-59549551ec90-metrics-certs\") pod \"frr-k8s-r4r27\" (UID: \"e0d02430-19e7-4515-ac98-59549551ec90\") " pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.637368 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.637729 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.637924 4676 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 04 15:34:29 crc kubenswrapper[4676]: E1204 15:34:29.638013 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist podName:a4165e19-a60f-458e-904c-9092df340dd0 nodeName:}" failed. No retries permitted until 2025-12-04 15:34:30.637991257 +0000 UTC m=+878.072661114 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist") pod "speaker-pn92f" (UID: "a4165e19-a60f-458e-904c-9092df340dd0") : secret "metallb-memberlist" not found Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.640978 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d025efd-41d1-4aa2-8bdf-348a4e378082-metrics-certs\") pod \"controller-f8648f98b-5p58x\" (UID: \"4d025efd-41d1-4aa2-8bdf-348a4e378082\") " pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.721280 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.731121 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:29 crc kubenswrapper[4676]: I1204 15:34:29.822345 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.066687 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv"] Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.151075 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" event={"ID":"652c71f4-1df3-45cb-9540-fac675f8134f","Type":"ContainerStarted","Data":"ee000234078dada505f3e9e478a71c251e2c200d91d84cdb4b9d96c1937b3d60"} Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.152112 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"572e79a7f7bf0c57e49c98eadc8c380033f4078499b0c04698cd355340b9cade"} Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.171982 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-5p58x"] Dec 04 15:34:30 crc kubenswrapper[4676]: W1204 15:34:30.178735 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d025efd_41d1_4aa2_8bdf_348a4e378082.slice/crio-9e5fcee39e736db34023bf075e24f597a4f37eba5d480a73834cd7ea058b658a WatchSource:0}: Error finding container 9e5fcee39e736db34023bf075e24f597a4f37eba5d480a73834cd7ea058b658a: Status 404 returned error can't find the container with id 9e5fcee39e736db34023bf075e24f597a4f37eba5d480a73834cd7ea058b658a Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.720318 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.737228 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a4165e19-a60f-458e-904c-9092df340dd0-memberlist\") pod \"speaker-pn92f\" (UID: \"a4165e19-a60f-458e-904c-9092df340dd0\") " pod="metallb-system/speaker-pn92f" Dec 04 15:34:30 crc kubenswrapper[4676]: I1204 15:34:30.994235 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pn92f" Dec 04 15:34:31 crc kubenswrapper[4676]: W1204 15:34:31.016286 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4165e19_a60f_458e_904c_9092df340dd0.slice/crio-01599d103cfab59759c24146cd1fcce20b32facc3be3f342ec7943eadbdcd888 WatchSource:0}: Error finding container 01599d103cfab59759c24146cd1fcce20b32facc3be3f342ec7943eadbdcd888: Status 404 returned error can't find the container with id 01599d103cfab59759c24146cd1fcce20b32facc3be3f342ec7943eadbdcd888 Dec 04 15:34:31 crc kubenswrapper[4676]: I1204 15:34:31.448130 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pn92f" event={"ID":"a4165e19-a60f-458e-904c-9092df340dd0","Type":"ContainerStarted","Data":"01599d103cfab59759c24146cd1fcce20b32facc3be3f342ec7943eadbdcd888"} Dec 04 15:34:31 crc kubenswrapper[4676]: I1204 15:34:31.460601 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5p58x" event={"ID":"4d025efd-41d1-4aa2-8bdf-348a4e378082","Type":"ContainerStarted","Data":"ce24d4b0f8adf04275eeb7dea02d3abad7b3beaca3a848f4cc31a0b3bf9e886e"} Dec 04 15:34:31 crc kubenswrapper[4676]: I1204 15:34:31.460654 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5p58x" event={"ID":"4d025efd-41d1-4aa2-8bdf-348a4e378082","Type":"ContainerStarted","Data":"ab271afefdff060529a4c8ae403c9ed501b291192a8aeb7dc9d6e1070e8bbc9d"} Dec 04 15:34:31 crc kubenswrapper[4676]: I1204 15:34:31.460665 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-5p58x" event={"ID":"4d025efd-41d1-4aa2-8bdf-348a4e378082","Type":"ContainerStarted","Data":"9e5fcee39e736db34023bf075e24f597a4f37eba5d480a73834cd7ea058b658a"} Dec 04 15:34:31 crc kubenswrapper[4676]: I1204 15:34:31.461029 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:32 crc kubenswrapper[4676]: I1204 15:34:32.586257 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pn92f" event={"ID":"a4165e19-a60f-458e-904c-9092df340dd0","Type":"ContainerStarted","Data":"3cc383934e8301615123ffca26942ea383226f8a870788a949f8a98d13d7d330"} Dec 04 15:34:32 crc kubenswrapper[4676]: I1204 15:34:32.586297 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pn92f" event={"ID":"a4165e19-a60f-458e-904c-9092df340dd0","Type":"ContainerStarted","Data":"9cd623d4c85446794671419913eb915cb70bab9e4d05a2dca38f3606d903d03b"} Dec 04 15:34:32 crc kubenswrapper[4676]: I1204 15:34:32.586331 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pn92f" Dec 04 15:34:32 crc kubenswrapper[4676]: I1204 15:34:32.622774 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-5p58x" podStartSLOduration=4.622740011 podStartE2EDuration="4.622740011s" podCreationTimestamp="2025-12-04 15:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:34:31.490608472 +0000 UTC m=+878.925278329" watchObservedRunningTime="2025-12-04 15:34:32.622740011 +0000 UTC m=+880.057409868" Dec 04 15:34:33 crc kubenswrapper[4676]: I1204 15:34:33.410881 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-pn92f" podStartSLOduration=5.410859355 podStartE2EDuration="5.410859355s" podCreationTimestamp="2025-12-04 15:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:34:32.639850994 +0000 UTC m=+880.074520851" watchObservedRunningTime="2025-12-04 15:34:33.410859355 +0000 UTC m=+880.845529212" Dec 04 15:34:34 crc kubenswrapper[4676]: I1204 15:34:34.320411 4676 patch_prober.go:28] interesting pod/dns-default-wk9bw container/dns namespace/openshift-dns: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=kubernetes Dec 04 15:34:34 crc kubenswrapper[4676]: I1204 15:34:34.321262 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-dns/dns-default-wk9bw" podUID="79d432ec-ac07-4516-a0a0-38fc02ec3e80" containerName="dns" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 04 15:34:44 crc kubenswrapper[4676]: I1204 15:34:44.026498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" event={"ID":"652c71f4-1df3-45cb-9540-fac675f8134f","Type":"ContainerStarted","Data":"693be3fd3eeae8e7645e91552353622311e758c2c18c6fb136d28f6300e4ff80"} Dec 04 15:34:44 crc kubenswrapper[4676]: I1204 15:34:44.027008 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:34:44 crc kubenswrapper[4676]: I1204 15:34:44.039350 4676 generic.go:334] "Generic (PLEG): container finished" podID="e0d02430-19e7-4515-ac98-59549551ec90" containerID="78f80471aba2c7240accff57482645b98cb32043ae2618c73a8edd053594af26" exitCode=0 Dec 04 15:34:44 crc kubenswrapper[4676]: I1204 15:34:44.039426 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerDied","Data":"78f80471aba2c7240accff57482645b98cb32043ae2618c73a8edd053594af26"} Dec 04 15:34:44 crc kubenswrapper[4676]: I1204 15:34:44.062810 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" podStartSLOduration=2.461483961 podStartE2EDuration="16.062793195s" podCreationTimestamp="2025-12-04 15:34:28 +0000 UTC" firstStartedPulling="2025-12-04 15:34:30.077581937 +0000 UTC m=+877.512251784" lastFinishedPulling="2025-12-04 15:34:43.678891161 +0000 UTC m=+891.113561018" observedRunningTime="2025-12-04 15:34:44.051272963 +0000 UTC m=+891.485942820" watchObservedRunningTime="2025-12-04 15:34:44.062793195 +0000 UTC m=+891.497463052" Dec 04 15:34:45 crc kubenswrapper[4676]: I1204 15:34:45.048203 4676 generic.go:334] "Generic (PLEG): container finished" podID="e0d02430-19e7-4515-ac98-59549551ec90" containerID="4bba39f0239cd6b6e89ef65fc5246e1a32a2670b859ff4af911435b887491900" exitCode=0 Dec 04 15:34:45 crc kubenswrapper[4676]: I1204 15:34:45.048269 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerDied","Data":"4bba39f0239cd6b6e89ef65fc5246e1a32a2670b859ff4af911435b887491900"} Dec 04 15:34:46 crc kubenswrapper[4676]: I1204 15:34:46.027060 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:34:46 crc kubenswrapper[4676]: I1204 15:34:46.027442 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:34:46 crc kubenswrapper[4676]: I1204 15:34:46.057038 4676 generic.go:334] "Generic (PLEG): container finished" podID="e0d02430-19e7-4515-ac98-59549551ec90" containerID="9657304f1eb0c94c2ef0284902ffdc2ba9bc8503bbf2b988826e63138aaacf20" exitCode=0 Dec 04 15:34:46 crc kubenswrapper[4676]: I1204 15:34:46.057083 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerDied","Data":"9657304f1eb0c94c2ef0284902ffdc2ba9bc8503bbf2b988826e63138aaacf20"} Dec 04 15:34:47 crc kubenswrapper[4676]: I1204 15:34:47.208075 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"1db0b3dfe67bac55b6c147290d8179e23fa80eb9a5baff0058ab0ae8d7d7c9a9"} Dec 04 15:34:47 crc kubenswrapper[4676]: I1204 15:34:47.208446 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"94033267bebf5c47a48657bcbe05f114341e66f852100578da267975d49252a1"} Dec 04 15:34:47 crc kubenswrapper[4676]: I1204 15:34:47.208457 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"238967c93bbd76ec4445b7c7147f5809e1c8d641f9f8cbd9ffe3e7375fa82b73"} Dec 04 15:34:47 crc kubenswrapper[4676]: I1204 15:34:47.208465 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"33f3e4dbeabe594140f204339326f6ea276af1b3f2fa92df3151d770cca8a536"} Dec 04 15:34:47 crc kubenswrapper[4676]: I1204 15:34:47.208473 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"f1a94b5544be22ad4a97f8b6679803433867ad6a02ddca93e9389733b361c027"} Dec 04 15:34:48 crc kubenswrapper[4676]: I1204 15:34:48.220816 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r4r27" event={"ID":"e0d02430-19e7-4515-ac98-59549551ec90","Type":"ContainerStarted","Data":"71e2861d5b5d9b3e1b89499fd151753ce4b0995ca5c3c05a42d2442bceedd08a"} Dec 04 15:34:48 crc kubenswrapper[4676]: I1204 15:34:48.221094 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:48 crc kubenswrapper[4676]: I1204 15:34:48.249159 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r4r27" podStartSLOduration=6.4343279429999996 podStartE2EDuration="20.24913676s" podCreationTimestamp="2025-12-04 15:34:28 +0000 UTC" firstStartedPulling="2025-12-04 15:34:29.881960219 +0000 UTC m=+877.316630086" lastFinishedPulling="2025-12-04 15:34:43.696769046 +0000 UTC m=+891.131438903" observedRunningTime="2025-12-04 15:34:48.244384473 +0000 UTC m=+895.679054360" 
watchObservedRunningTime="2025-12-04 15:34:48.24913676 +0000 UTC m=+895.683806617" Dec 04 15:34:49 crc kubenswrapper[4676]: I1204 15:34:49.721864 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:49 crc kubenswrapper[4676]: I1204 15:34:49.762132 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:49 crc kubenswrapper[4676]: I1204 15:34:49.826757 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-5p58x" Dec 04 15:34:51 crc kubenswrapper[4676]: I1204 15:34:51.003816 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pn92f" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.080400 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.081590 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.091217 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-bmp2b" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.091331 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.091343 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.096032 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.185053 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlxlb\" (UniqueName: \"kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb\") pod \"openstack-operator-index-rstfs\" (UID: \"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf\") " pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.287719 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlxlb\" (UniqueName: \"kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb\") pod \"openstack-operator-index-rstfs\" (UID: \"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf\") " pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.312090 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlxlb\" (UniqueName: \"kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb\") pod \"openstack-operator-index-rstfs\" (UID: \"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf\") " pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.398078 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:54 crc kubenswrapper[4676]: I1204 15:34:54.836406 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:54 crc kubenswrapper[4676]: W1204 15:34:54.843295 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df23a3c_3e34_4de9_bbe8_8f1805ff2fcf.slice/crio-6f8888cf8fbf42bb25f74fa9b7529653b4293dcbb4e5c2bedc66b7e523e885e8 WatchSource:0}: Error finding container 6f8888cf8fbf42bb25f74fa9b7529653b4293dcbb4e5c2bedc66b7e523e885e8: Status 404 returned error can't find the container with id 6f8888cf8fbf42bb25f74fa9b7529653b4293dcbb4e5c2bedc66b7e523e885e8 Dec 04 15:34:55 crc kubenswrapper[4676]: I1204 15:34:55.279381 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rstfs" event={"ID":"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf","Type":"ContainerStarted","Data":"6f8888cf8fbf42bb25f74fa9b7529653b4293dcbb4e5c2bedc66b7e523e885e8"} Dec 04 15:34:57 crc kubenswrapper[4676]: I1204 15:34:57.461727 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.074132 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fbzg5"] Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.075176 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.081875 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fbzg5"] Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.146864 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dst6s\" (UniqueName: \"kubernetes.io/projected/24f18240-bbb2-4c1c-b396-e5d2a6d44514-kube-api-access-dst6s\") pod \"openstack-operator-index-fbzg5\" (UID: \"24f18240-bbb2-4c1c-b396-e5d2a6d44514\") " pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.248687 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dst6s\" (UniqueName: \"kubernetes.io/projected/24f18240-bbb2-4c1c-b396-e5d2a6d44514-kube-api-access-dst6s\") pod \"openstack-operator-index-fbzg5\" (UID: \"24f18240-bbb2-4c1c-b396-e5d2a6d44514\") " pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.266836 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dst6s\" (UniqueName: \"kubernetes.io/projected/24f18240-bbb2-4c1c-b396-e5d2a6d44514-kube-api-access-dst6s\") pod \"openstack-operator-index-fbzg5\" (UID: \"24f18240-bbb2-4c1c-b396-e5d2a6d44514\") " pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.300417 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rstfs" event={"ID":"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf","Type":"ContainerStarted","Data":"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d"} Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.300573 4676 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack-operators/openstack-operator-index-rstfs" podUID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" containerName="registry-server" containerID="cri-o://f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d" gracePeriod=2 Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.320515 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rstfs" podStartSLOduration=1.8170638669999999 podStartE2EDuration="4.320494408s" podCreationTimestamp="2025-12-04 15:34:54 +0000 UTC" firstStartedPulling="2025-12-04 15:34:54.846487233 +0000 UTC m=+902.281157090" lastFinishedPulling="2025-12-04 15:34:57.349917774 +0000 UTC m=+904.784587631" observedRunningTime="2025-12-04 15:34:58.314410972 +0000 UTC m=+905.749080829" watchObservedRunningTime="2025-12-04 15:34:58.320494408 +0000 UTC m=+905.755164265" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.405193 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.745748 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.857028 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlxlb\" (UniqueName: \"kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb\") pod \"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf\" (UID: \"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf\") " Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.866747 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb" (OuterVolumeSpecName: "kube-api-access-dlxlb") pod "1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" (UID: "1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf"). InnerVolumeSpecName "kube-api-access-dlxlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.869058 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fbzg5"] Dec 04 15:34:58 crc kubenswrapper[4676]: W1204 15:34:58.872235 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24f18240_bbb2_4c1c_b396_e5d2a6d44514.slice/crio-8126e9f0166ddb4a2188a0ae63fea1bba9241a11f3e8efdca37711780fefeeb5 WatchSource:0}: Error finding container 8126e9f0166ddb4a2188a0ae63fea1bba9241a11f3e8efdca37711780fefeeb5: Status 404 returned error can't find the container with id 8126e9f0166ddb4a2188a0ae63fea1bba9241a11f3e8efdca37711780fefeeb5 Dec 04 15:34:58 crc kubenswrapper[4676]: I1204 15:34:58.958766 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlxlb\" (UniqueName: \"kubernetes.io/projected/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf-kube-api-access-dlxlb\") on node \"crc\" DevicePath \"\"" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.309490 4676 generic.go:334] "Generic (PLEG): container finished" podID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" containerID="f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d" exitCode=0 Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.309577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rstfs" event={"ID":"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf","Type":"ContainerDied","Data":"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d"} Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.309610 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rstfs" event={"ID":"1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf","Type":"ContainerDied","Data":"6f8888cf8fbf42bb25f74fa9b7529653b4293dcbb4e5c2bedc66b7e523e885e8"} Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.309625 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rstfs" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.309659 4676 scope.go:117] "RemoveContainer" containerID="f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.311772 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbzg5" event={"ID":"24f18240-bbb2-4c1c-b396-e5d2a6d44514","Type":"ContainerStarted","Data":"0cf135bda80020d31a7dae044afadfe827338e214fcca51127ef0082f4233be0"} Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.311831 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fbzg5" event={"ID":"24f18240-bbb2-4c1c-b396-e5d2a6d44514","Type":"ContainerStarted","Data":"8126e9f0166ddb4a2188a0ae63fea1bba9241a11f3e8efdca37711780fefeeb5"} Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.327532 4676 scope.go:117] "RemoveContainer" containerID="f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d" Dec 04 15:34:59 crc kubenswrapper[4676]: E1204 15:34:59.327950 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d\": container with ID starting with f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d not found: ID does not exist" containerID="f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.328000 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d"} err="failed to get container status \"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d\": rpc error: code = NotFound desc = could not find container \"f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d\": container with ID starting with f407fe09ca6befcb2ca1d5acab78bbef589f30d04c65646535119025b32fd22d not found: ID does not exist" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.342800 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fbzg5" podStartSLOduration=1.283296907 podStartE2EDuration="1.342771081s" podCreationTimestamp="2025-12-04 15:34:58 +0000 UTC" firstStartedPulling="2025-12-04 15:34:58.876378809 +0000 UTC m=+906.311048666" lastFinishedPulling="2025-12-04 15:34:58.935852983 +0000 UTC m=+906.370522840" observedRunningTime="2025-12-04 15:34:59.338415076 +0000 UTC m=+906.773084933" watchObservedRunningTime="2025-12-04 15:34:59.342771081 +0000 UTC m=+906.777440938" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.356224 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.358457 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rstfs"] Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.394207 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" path="/var/lib/kubelet/pods/1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf/volumes" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.726796 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-r4r27" Dec 04 15:34:59 crc kubenswrapper[4676]: I1204 15:34:59.740540 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7g2tv" Dec 04 15:35:08 crc kubenswrapper[4676]: I1204 15:35:08.415435 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:35:08 crc kubenswrapper[4676]: I1204 15:35:08.417466 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:35:08 crc kubenswrapper[4676]: I1204 15:35:08.446251 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:35:09 crc kubenswrapper[4676]: I1204 15:35:09.459569 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fbzg5" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.905168 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn"] Dec 04 15:35:10 crc kubenswrapper[4676]: E1204 15:35:10.905610 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" containerName="registry-server" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.905649 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" containerName="registry-server" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.905893 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df23a3c-3e34-4de9-bbe8-8f1805ff2fcf" containerName="registry-server" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.907305 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.913879 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-h26sb" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.918598 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn"] Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.940124 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksrm\" (UniqueName: \"kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.940200 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:10 crc kubenswrapper[4676]: I1204 15:35:10.940248 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.040770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.040868 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksrm\" (UniqueName: \"kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.040948 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.041305 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.041358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.060599 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksrm\" (UniqueName: \"kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm\") pod \"6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.231028 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:11 crc kubenswrapper[4676]: I1204 15:35:11.723152 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn"] Dec 04 15:35:12 crc kubenswrapper[4676]: I1204 15:35:12.544680 4676 generic.go:334] "Generic (PLEG): container finished" podID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerID="40d6932374b83baa2e4f561083561543af5ff28d0ff842ceecfeaac26e1c16e7" exitCode=0 Dec 04 15:35:12 crc kubenswrapper[4676]: I1204 15:35:12.544748 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" event={"ID":"77e9ca65-5ca8-4d5d-8b88-080a95a82529","Type":"ContainerDied","Data":"40d6932374b83baa2e4f561083561543af5ff28d0ff842ceecfeaac26e1c16e7"} Dec 04 15:35:12 crc kubenswrapper[4676]: I1204 15:35:12.544787 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" event={"ID":"77e9ca65-5ca8-4d5d-8b88-080a95a82529","Type":"ContainerStarted","Data":"a48229393d3919279f549ce6ea8ba8c45da454ed14381a327654c01ab63e1d0a"} Dec 04 15:35:13 crc kubenswrapper[4676]: I1204 15:35:13.554257 4676 generic.go:334] "Generic (PLEG): container finished" podID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerID="57bb8f34bd6cfd1272690dfd34872bca173c849a25902ead846b5039fc872e02" exitCode=0 Dec 04 15:35:13 crc kubenswrapper[4676]: I1204 15:35:13.554315 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" event={"ID":"77e9ca65-5ca8-4d5d-8b88-080a95a82529","Type":"ContainerDied","Data":"57bb8f34bd6cfd1272690dfd34872bca173c849a25902ead846b5039fc872e02"} Dec 04 15:35:14 crc kubenswrapper[4676]: I1204 15:35:14.563239 4676 generic.go:334] "Generic (PLEG): container finished" podID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerID="d49e79a66809879f4ba3435fb4c8f397fe3982ea38808711f15bb084eeee6429" exitCode=0 Dec 04 15:35:14 crc kubenswrapper[4676]: I1204 15:35:14.563292 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" event={"ID":"77e9ca65-5ca8-4d5d-8b88-080a95a82529","Type":"ContainerDied","Data":"d49e79a66809879f4ba3435fb4c8f397fe3982ea38808711f15bb084eeee6429"} Dec 04 15:35:15 crc kubenswrapper[4676]: I1204 15:35:15.840231 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.100864 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ksrm\" (UniqueName: \"kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm\") pod \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.101014 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util\") pod \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.101083 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle\") pod \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\" (UID: \"77e9ca65-5ca8-4d5d-8b88-080a95a82529\") " Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.102640 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.102725 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.107299 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm" (OuterVolumeSpecName: "kube-api-access-4ksrm") pod "77e9ca65-5ca8-4d5d-8b88-080a95a82529" (UID: "77e9ca65-5ca8-4d5d-8b88-080a95a82529"). InnerVolumeSpecName "kube-api-access-4ksrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.110383 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle" (OuterVolumeSpecName: "bundle") pod "77e9ca65-5ca8-4d5d-8b88-080a95a82529" (UID: "77e9ca65-5ca8-4d5d-8b88-080a95a82529"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.115383 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util" (OuterVolumeSpecName: "util") pod "77e9ca65-5ca8-4d5d-8b88-080a95a82529" (UID: "77e9ca65-5ca8-4d5d-8b88-080a95a82529"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.203477 4676 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-util\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.203536 4676 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77e9ca65-5ca8-4d5d-8b88-080a95a82529-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.203558 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ksrm\" (UniqueName: \"kubernetes.io/projected/77e9ca65-5ca8-4d5d-8b88-080a95a82529-kube-api-access-4ksrm\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.577177 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" event={"ID":"77e9ca65-5ca8-4d5d-8b88-080a95a82529","Type":"ContainerDied","Data":"a48229393d3919279f549ce6ea8ba8c45da454ed14381a327654c01ab63e1d0a"} Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.577247 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48229393d3919279f549ce6ea8ba8c45da454ed14381a327654c01ab63e1d0a" Dec 04 15:35:16 crc kubenswrapper[4676]: I1204 15:35:16.577266 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.481938 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv"] Dec 04 15:35:23 crc kubenswrapper[4676]: E1204 15:35:23.482728 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="extract" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.482742 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="extract" Dec 04 15:35:23 crc kubenswrapper[4676]: E1204 15:35:23.482762 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="pull" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.482769 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="pull" Dec 04 15:35:23 crc kubenswrapper[4676]: E1204 15:35:23.482785 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="util" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.482791 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="util" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.482901 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="77e9ca65-5ca8-4d5d-8b88-080a95a82529" containerName="extract" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.483706 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.486356 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-ls7h8" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.557602 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv"] Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.650754 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwkb4\" (UniqueName: \"kubernetes.io/projected/27c20c8b-a18c-40e3-a45f-4cf9b1fb4510-kube-api-access-mwkb4\") pod \"openstack-operator-controller-operator-577c877dd7-7ktcv\" (UID: \"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510\") " pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.752116 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwkb4\" (UniqueName: \"kubernetes.io/projected/27c20c8b-a18c-40e3-a45f-4cf9b1fb4510-kube-api-access-mwkb4\") pod \"openstack-operator-controller-operator-577c877dd7-7ktcv\" (UID: \"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510\") " pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.775845 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwkb4\" (UniqueName: \"kubernetes.io/projected/27c20c8b-a18c-40e3-a45f-4cf9b1fb4510-kube-api-access-mwkb4\") pod \"openstack-operator-controller-operator-577c877dd7-7ktcv\" (UID: \"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510\") " pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:23 crc kubenswrapper[4676]: I1204 15:35:23.801785 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:24 crc kubenswrapper[4676]: I1204 15:35:24.497255 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv"] Dec 04 15:35:24 crc kubenswrapper[4676]: I1204 15:35:24.630051 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" event={"ID":"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510","Type":"ContainerStarted","Data":"b498bbdc6436a1d599e490c04c80657eba94f0643de3b10d376a945ffa60de46"} Dec 04 15:35:29 crc kubenswrapper[4676]: I1204 15:35:29.661670 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" event={"ID":"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510","Type":"ContainerStarted","Data":"40cde068c50f79dd7e5bb9923cd6059194c32f520d7966066eb149761180074a"} Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.375432 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.390532 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.399879 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.585813 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6trmk\" (UniqueName: \"kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.585875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.585925 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.687527 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6trmk\" (UniqueName: \"kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.687600 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.687638 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.688183 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.688254 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.714983 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6trmk\" (UniqueName: \"kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk\") pod \"community-operators-hpfr9\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:30 crc kubenswrapper[4676]: I1204 15:35:30.722401 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:31 crc kubenswrapper[4676]: I1204 15:35:31.333239 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:31 crc kubenswrapper[4676]: I1204 15:35:31.677143 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerStarted","Data":"dc95c6f3683881ecd56a20c195361b1ba1e7762ae6941a8a369b068f2905ebc5"} Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.685564 4676 generic.go:334] "Generic (PLEG): container finished" podID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerID="412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a" exitCode=0 Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.685638 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerDied","Data":"412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a"} Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.688012 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" event={"ID":"27c20c8b-a18c-40e3-a45f-4cf9b1fb4510","Type":"ContainerStarted","Data":"5b847a27b597ea073b2eed29d9a3ac662a86dd48e0c63fe57de513d86280dfff"} Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.688117 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.688181 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:32 crc kubenswrapper[4676]: I1204 15:35:32.731981 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" podStartSLOduration=3.168254254 podStartE2EDuration="9.731945811s" podCreationTimestamp="2025-12-04 15:35:23 +0000 UTC" firstStartedPulling="2025-12-04 15:35:24.508616384 +0000 UTC m=+931.943286241" lastFinishedPulling="2025-12-04 15:35:31.072307941 +0000 UTC m=+938.506977798" observedRunningTime="2025-12-04 15:35:32.730287153 +0000 UTC m=+940.164957030" watchObservedRunningTime="2025-12-04 15:35:32.731945811 +0000 UTC m=+940.166615668" Dec 04 15:35:33 crc kubenswrapper[4676]: I1204 15:35:33.697899 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-577c877dd7-7ktcv" Dec 04 15:35:34 crc kubenswrapper[4676]: I1204 15:35:34.702527 4676 generic.go:334] "Generic (PLEG): container finished" podID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerID="d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1" exitCode=0 Dec 04 15:35:34 crc kubenswrapper[4676]: I1204 15:35:34.702644 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerDied","Data":"d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1"} Dec 04 15:35:36 crc kubenswrapper[4676]: I1204 15:35:36.718529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerStarted","Data":"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1"} Dec 04 15:35:36 crc kubenswrapper[4676]: I1204 15:35:36.743497 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hpfr9" podStartSLOduration=3.754400775 podStartE2EDuration="6.743476359s" podCreationTimestamp="2025-12-04 15:35:30 +0000 UTC" firstStartedPulling="2025-12-04 15:35:32.687808891 +0000 UTC m=+940.122478748" lastFinishedPulling="2025-12-04 15:35:35.676884475 +0000 UTC m=+943.111554332" observedRunningTime="2025-12-04 15:35:36.74107459 +0000 UTC m=+944.175744487" watchObservedRunningTime="2025-12-04 15:35:36.743476359 +0000 UTC m=+944.178146216" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.377619 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.380508 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.382473 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.427601 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.427695 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk47c\" (UniqueName: \"kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.427717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.529444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk47c\" (UniqueName: \"kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.529543 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.530157 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.530327 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.530674 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.559355 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk47c\" (UniqueName: \"kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c\") pod \"certified-operators-5w7r8\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.699332 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.723300 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.727189 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:40 crc kubenswrapper[4676]: I1204 15:35:40.818624 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:41 crc kubenswrapper[4676]: I1204 15:35:41.240467 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:41 crc kubenswrapper[4676]: W1204 15:35:41.257630 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395ffe4b_ade5_4326_8a64_03892c41efd7.slice/crio-f06ebcca7f2da07a58dc3ca32dc55b2d914fcf679fecb3d38c5f62586c78d33d WatchSource:0}: Error finding container f06ebcca7f2da07a58dc3ca32dc55b2d914fcf679fecb3d38c5f62586c78d33d: Status 404 returned error can't find the container with id f06ebcca7f2da07a58dc3ca32dc55b2d914fcf679fecb3d38c5f62586c78d33d Dec 04 15:35:41 crc kubenswrapper[4676]: I1204 15:35:41.753639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerStarted","Data":"f06ebcca7f2da07a58dc3ca32dc55b2d914fcf679fecb3d38c5f62586c78d33d"} Dec 04 15:35:41 crc kubenswrapper[4676]: I1204 15:35:41.815536 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:42 crc kubenswrapper[4676]: I1204 15:35:42.763094 4676 generic.go:334] "Generic (PLEG): container finished" podID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerID="f3112355643b17cd89d19da49e71e8d385203467f314b456f18499248a11fdd6" exitCode=0 Dec 04 15:35:42 crc kubenswrapper[4676]: I1204 15:35:42.763186 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerDied","Data":"f3112355643b17cd89d19da49e71e8d385203467f314b456f18499248a11fdd6"} Dec 04 15:35:43 crc kubenswrapper[4676]: I1204 15:35:43.165197 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:43 crc kubenswrapper[4676]: I1204 15:35:43.771865 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hpfr9" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="registry-server" containerID="cri-o://9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1" gracePeriod=2 Dec 04 15:35:43 crc kubenswrapper[4676]: I1204 15:35:43.772957 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerStarted","Data":"e8692287b318a49a22e70347c3b1f2280f5309b031733dfb757d4eb47d1b873b"} Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.161632 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.188391 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content\") pod \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.188455 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities\") pod \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.188513 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6trmk\" (UniqueName: \"kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk\") pod \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\" (UID: \"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8\") " Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.189818 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities" (OuterVolumeSpecName: "utilities") pod "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" (UID: "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.193858 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk" (OuterVolumeSpecName: "kube-api-access-6trmk") pod "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" (UID: "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8"). InnerVolumeSpecName "kube-api-access-6trmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.241429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" (UID: "568a1a12-bb73-456a-bdc5-b8ff5bdd13e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.290423 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.290465 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.290480 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6trmk\" (UniqueName: \"kubernetes.io/projected/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8-kube-api-access-6trmk\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.780673 4676 generic.go:334] "Generic (PLEG): container finished" podID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerID="e8692287b318a49a22e70347c3b1f2280f5309b031733dfb757d4eb47d1b873b" exitCode=0 Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.780768 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerDied","Data":"e8692287b318a49a22e70347c3b1f2280f5309b031733dfb757d4eb47d1b873b"} Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.783747 4676 generic.go:334] "Generic (PLEG): container finished" podID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerID="9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1" exitCode=0 Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.783803 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerDied","Data":"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1"} Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.783809 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hpfr9" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.783889 4676 scope.go:117] "RemoveContainer" containerID="9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.783847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hpfr9" event={"ID":"568a1a12-bb73-456a-bdc5-b8ff5bdd13e8","Type":"ContainerDied","Data":"dc95c6f3683881ecd56a20c195361b1ba1e7762ae6941a8a369b068f2905ebc5"} Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.801545 4676 scope.go:117] "RemoveContainer" containerID="d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.817968 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.827431 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hpfr9"] Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.844316 4676 scope.go:117] "RemoveContainer" containerID="412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.861952 4676 scope.go:117] "RemoveContainer" containerID="9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1" Dec 04 15:35:44 crc kubenswrapper[4676]: E1204 15:35:44.862563 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1\": container with ID starting with 9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1 not found: ID does not exist" containerID="9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.862620 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1"} err="failed to get container status \"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1\": rpc error: code = NotFound desc = could not find container \"9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1\": container with ID starting with 9c1240a5dffd986c5f9ed594207506399fcb7f5d0d95af5ef7b71d5e49a98fe1 not found: ID does not exist" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.862652 4676 scope.go:117] "RemoveContainer" containerID="d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1" Dec 04 15:35:44 crc kubenswrapper[4676]: E1204 15:35:44.863049 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1\": container with ID starting with d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1 not found: ID does not exist" containerID="d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.863105 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1"} err="failed to get container status \"d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1\": rpc error: code = NotFound desc = could not find 
container \"d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1\": container with ID starting with d58a14f8160240df2968fba323f23f76169874aa5597fb2671aabdfcb5dec7f1 not found: ID does not exist" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.863142 4676 scope.go:117] "RemoveContainer" containerID="412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a" Dec 04 15:35:44 crc kubenswrapper[4676]: E1204 15:35:44.863460 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a\": container with ID starting with 412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a not found: ID does not exist" containerID="412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a" Dec 04 15:35:44 crc kubenswrapper[4676]: I1204 15:35:44.863488 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a"} err="failed to get container status \"412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a\": rpc error: code = NotFound desc = could not find container \"412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a\": container with ID starting with 412d2ce9cb9895b3e87f7cb05f21af3c16e744e22f2af1e63c775fcf807de06a not found: ID does not exist" Dec 04 15:35:45 crc kubenswrapper[4676]: I1204 15:35:45.393204 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" path="/var/lib/kubelet/pods/568a1a12-bb73-456a-bdc5-b8ff5bdd13e8/volumes" Dec 04 15:35:45 crc kubenswrapper[4676]: I1204 15:35:45.794048 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerStarted","Data":"ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf"} Dec 04 15:35:45 crc kubenswrapper[4676]: I1204 15:35:45.827642 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5w7r8" podStartSLOduration=3.2836534410000002 podStartE2EDuration="5.827618123s" podCreationTimestamp="2025-12-04 15:35:40 +0000 UTC" firstStartedPulling="2025-12-04 15:35:42.765016475 +0000 UTC m=+950.199686332" lastFinishedPulling="2025-12-04 15:35:45.308981157 +0000 UTC m=+952.743651014" observedRunningTime="2025-12-04 15:35:45.819547721 +0000 UTC m=+953.254217588" watchObservedRunningTime="2025-12-04 15:35:45.827618123 +0000 UTC m=+953.262287980" Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.027055 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.027128 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.027180 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.027873 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.027984 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195" gracePeriod=600 Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.803601 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195" exitCode=0 Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.803652 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195"} Dec 04 15:35:46 crc kubenswrapper[4676]: I1204 15:35:46.804212 4676 scope.go:117] "RemoveContainer" containerID="9fe7a265e00c1d56ac021f0f7b498108db8f42348e6b750a6c0468f9b25973a9" Dec 04 15:35:47 crc kubenswrapper[4676]: I1204 15:35:47.815367 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b"} Dec 04 15:35:50 crc kubenswrapper[4676]: I1204 15:35:50.700611 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:50 crc kubenswrapper[4676]: I1204 15:35:50.701000 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:50 crc kubenswrapper[4676]: I1204 15:35:50.744424 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:50 crc kubenswrapper[4676]: I1204 15:35:50.896997 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:50 crc kubenswrapper[4676]: I1204 15:35:50.976316 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.324837 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj"] Dec 04 15:35:51 crc kubenswrapper[4676]: E1204 15:35:51.325425 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="registry-server" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.325457 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="registry-server" Dec 04 15:35:51 crc 
kubenswrapper[4676]: E1204 15:35:51.325494 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="extract-utilities" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.325504 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="extract-utilities" Dec 04 15:35:51 crc kubenswrapper[4676]: E1204 15:35:51.325530 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="extract-content" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.325540 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="extract-content" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.325718 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="568a1a12-bb73-456a-bdc5-b8ff5bdd13e8" containerName="registry-server" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.326967 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.329415 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wj2dv" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.334612 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.335808 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.341079 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.342463 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.346059 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-glmkl" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.348536 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-kdvvq" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.355758 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.363116 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.424388 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvkq\" (UniqueName: \"kubernetes.io/projected/db83cc98-e9f7-4c8a-989a-ad3150de91b9-kube-api-access-jbvkq\") pod \"cinder-operator-controller-manager-748967c98-zbsm7\" (UID: \"db83cc98-e9f7-4c8a-989a-ad3150de91b9\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.424478 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcbsd\" (UniqueName: \"kubernetes.io/projected/191599a4-dee2-4d6c-b7ba-09e4f60faaf5-kube-api-access-rcbsd\") pod \"barbican-operator-controller-manager-5bfbbb859d-p52sj\" (UID: \"191599a4-dee2-4d6c-b7ba-09e4f60faaf5\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.424546 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6k9z\" (UniqueName: \"kubernetes.io/projected/c7bf3f72-274b-4db9-8822-25999acad8b6-kube-api-access-c6k9z\") pod \"designate-operator-controller-manager-6788cc6d75-hrr7c\" (UID: \"c7bf3f72-274b-4db9-8822-25999acad8b6\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.426023 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.427438 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.436151 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hrsf5" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.444674 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.458987 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.460540 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.469987 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.471436 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.482775 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nkbdm" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.483154 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nbbfx" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.525852 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6k9z\" (UniqueName: \"kubernetes.io/projected/c7bf3f72-274b-4db9-8822-25999acad8b6-kube-api-access-c6k9z\") pod \"designate-operator-controller-manager-6788cc6d75-hrr7c\" (UID: \"c7bf3f72-274b-4db9-8822-25999acad8b6\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.525957 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvkq\" (UniqueName: \"kubernetes.io/projected/db83cc98-e9f7-4c8a-989a-ad3150de91b9-kube-api-access-jbvkq\") pod \"cinder-operator-controller-manager-748967c98-zbsm7\" (UID: \"db83cc98-e9f7-4c8a-989a-ad3150de91b9\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.526018 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcbsd\" (UniqueName: \"kubernetes.io/projected/191599a4-dee2-4d6c-b7ba-09e4f60faaf5-kube-api-access-rcbsd\") pod \"barbican-operator-controller-manager-5bfbbb859d-p52sj\" (UID: \"191599a4-dee2-4d6c-b7ba-09e4f60faaf5\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.652330 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kmzl\" (UniqueName: \"kubernetes.io/projected/dbba238e-b271-48f0-9356-c1ba4b7446f8-kube-api-access-9kmzl\") pod \"heat-operator-controller-manager-698d6fd7d6-h74fd\" (UID: \"dbba238e-b271-48f0-9356-c1ba4b7446f8\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.656240 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkz6\" (UniqueName: \"kubernetes.io/projected/1b01dbe4-9e3e-403e-938a-22f130b47202-kube-api-access-pnkz6\") pod \"horizon-operator-controller-manager-7d5d9fd47f-vf7rm\" (UID: \"1b01dbe4-9e3e-403e-938a-22f130b47202\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.656326 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn8xh\" (UniqueName: \"kubernetes.io/projected/25a6adcc-b6f7-41ee-a0d3-9594455bedda-kube-api-access-kn8xh\") pod 
\"glance-operator-controller-manager-85fbd69fcd-7vsrd\" (UID: \"25a6adcc-b6f7-41ee-a0d3-9594455bedda\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.656556 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.661877 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.669223 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.670738 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.671803 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.688012 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.689865 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.689971 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ztd62" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.701025 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.702243 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.706429 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.707781 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.713856 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.714768 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcbsd\" (UniqueName: \"kubernetes.io/projected/191599a4-dee2-4d6c-b7ba-09e4f60faaf5-kube-api-access-rcbsd\") pod \"barbican-operator-controller-manager-5bfbbb859d-p52sj\" (UID: \"191599a4-dee2-4d6c-b7ba-09e4f60faaf5\") " pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.717695 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6bbp5" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.718920 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-r7j2k" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.721187 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6k9z\" (UniqueName: \"kubernetes.io/projected/c7bf3f72-274b-4db9-8822-25999acad8b6-kube-api-access-c6k9z\") pod \"designate-operator-controller-manager-6788cc6d75-hrr7c\" (UID: \"c7bf3f72-274b-4db9-8822-25999acad8b6\") " pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.723086 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvkq\" (UniqueName: \"kubernetes.io/projected/db83cc98-e9f7-4c8a-989a-ad3150de91b9-kube-api-access-jbvkq\") pod \"cinder-operator-controller-manager-748967c98-zbsm7\" (UID: \"db83cc98-e9f7-4c8a-989a-ad3150de91b9\") " pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.727117 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.777334 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kmzl\" (UniqueName: \"kubernetes.io/projected/dbba238e-b271-48f0-9356-c1ba4b7446f8-kube-api-access-9kmzl\") pod \"heat-operator-controller-manager-698d6fd7d6-h74fd\" (UID: \"dbba238e-b271-48f0-9356-c1ba4b7446f8\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.777424 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkz6\" (UniqueName: \"kubernetes.io/projected/1b01dbe4-9e3e-403e-938a-22f130b47202-kube-api-access-pnkz6\") pod \"horizon-operator-controller-manager-7d5d9fd47f-vf7rm\" (UID: \"1b01dbe4-9e3e-403e-938a-22f130b47202\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.777480 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn8xh\" (UniqueName: \"kubernetes.io/projected/25a6adcc-b6f7-41ee-a0d3-9594455bedda-kube-api-access-kn8xh\") pod \"glance-operator-controller-manager-85fbd69fcd-7vsrd\" (UID: 
\"25a6adcc-b6f7-41ee-a0d3-9594455bedda\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.804502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkz6\" (UniqueName: \"kubernetes.io/projected/1b01dbe4-9e3e-403e-938a-22f130b47202-kube-api-access-pnkz6\") pod \"horizon-operator-controller-manager-7d5d9fd47f-vf7rm\" (UID: \"1b01dbe4-9e3e-403e-938a-22f130b47202\") " pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.821963 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.829314 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kmzl\" (UniqueName: \"kubernetes.io/projected/dbba238e-b271-48f0-9356-c1ba4b7446f8-kube-api-access-9kmzl\") pod \"heat-operator-controller-manager-698d6fd7d6-h74fd\" (UID: \"dbba238e-b271-48f0-9356-c1ba4b7446f8\") " pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.855394 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.867593 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vrlzr" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.868738 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.870506 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.879046 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e4b1ff-3345-4104-b333-cba2f5cd9388-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.879128 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z62sl\" (UniqueName: \"kubernetes.io/projected/02e4b1ff-3345-4104-b333-cba2f5cd9388-kube-api-access-z62sl\") pod \"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.879185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsz9c\" (UniqueName: \"kubernetes.io/projected/171288d7-22db-4357-bbfc-0f5ffa6b709c-kube-api-access-jsz9c\") pod \"ironic-operator-controller-manager-54485f899-zgqv7\" (UID: \"171288d7-22db-4357-bbfc-0f5ffa6b709c\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.879218 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj9kr\" (UniqueName: \"kubernetes.io/projected/62a08aac-45ea-4944-9d7f-9d78114d07a0-kube-api-access-dj9kr\") pod \"keystone-operator-controller-manager-79cc9d59f5-tqc5z\" (UID: \"62a08aac-45ea-4944-9d7f-9d78114d07a0\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.884327 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-rlkr6" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.892756 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.900483 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn8xh\" (UniqueName: \"kubernetes.io/projected/25a6adcc-b6f7-41ee-a0d3-9594455bedda-kube-api-access-kn8xh\") pod \"glance-operator-controller-manager-85fbd69fcd-7vsrd\" (UID: \"25a6adcc-b6f7-41ee-a0d3-9594455bedda\") " pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.934077 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv"] Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.954022 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.955487 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:35:51 crc kubenswrapper[4676]: I1204 15:35:51.973726 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:51.992772 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:51.994815 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsz9c\" (UniqueName: \"kubernetes.io/projected/171288d7-22db-4357-bbfc-0f5ffa6b709c-kube-api-access-jsz9c\") pod \"ironic-operator-controller-manager-54485f899-zgqv7\" (UID: \"171288d7-22db-4357-bbfc-0f5ffa6b709c\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.007144 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj9kr\" (UniqueName: \"kubernetes.io/projected/62a08aac-45ea-4944-9d7f-9d78114d07a0-kube-api-access-dj9kr\") pod \"keystone-operator-controller-manager-79cc9d59f5-tqc5z\" (UID: \"62a08aac-45ea-4944-9d7f-9d78114d07a0\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.007357 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8w8j\" (UniqueName: \"kubernetes.io/projected/ee1e0a33-feb5-4a3b-8d62-dca835529d5e-kube-api-access-n8w8j\") pod \"mariadb-operator-controller-manager-64d7c556cd-5nstv\" (UID: \"ee1e0a33-feb5-4a3b-8d62-dca835529d5e\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.007415 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e4b1ff-3345-4104-b333-cba2f5cd9388-cert\") pod \"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.007467 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr98j\" (UniqueName: \"kubernetes.io/projected/d28e781c-96cf-4377-8cbc-f32b112e3dc7-kube-api-access-jr98j\") pod \"manila-operator-controller-manager-5cbc8c7f96-lpl84\" (UID: \"d28e781c-96cf-4377-8cbc-f32b112e3dc7\") " pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.007532 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z62sl\" (UniqueName: \"kubernetes.io/projected/02e4b1ff-3345-4104-b333-cba2f5cd9388-kube-api-access-z62sl\") pod \"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.032034 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/02e4b1ff-3345-4104-b333-cba2f5cd9388-cert\") pod 
\"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.032147 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.045289 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.071358 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.086314 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.099630 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-hnfgn" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.108274 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.108711 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8w8j\" (UniqueName: \"kubernetes.io/projected/ee1e0a33-feb5-4a3b-8d62-dca835529d5e-kube-api-access-n8w8j\") pod \"mariadb-operator-controller-manager-64d7c556cd-5nstv\" (UID: \"ee1e0a33-feb5-4a3b-8d62-dca835529d5e\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.108770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr98j\" (UniqueName: \"kubernetes.io/projected/d28e781c-96cf-4377-8cbc-f32b112e3dc7-kube-api-access-jr98j\") pod \"manila-operator-controller-manager-5cbc8c7f96-lpl84\" (UID: \"d28e781c-96cf-4377-8cbc-f32b112e3dc7\") " pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.108820 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8bsm\" (UniqueName: \"kubernetes.io/projected/f5882b54-a120-4eff-88e8-bf0a5d7758ff-kube-api-access-s8bsm\") pod \"neutron-operator-controller-manager-58879495c-g2b6v\" (UID: \"f5882b54-a120-4eff-88e8-bf0a5d7758ff\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.110061 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.116536 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-j7tk5" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.141000 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.142425 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.164248 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.175729 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z62sl\" (UniqueName: \"kubernetes.io/projected/02e4b1ff-3345-4104-b333-cba2f5cd9388-kube-api-access-z62sl\") pod \"infra-operator-controller-manager-6c55d8d69b-jjrmb\" (UID: \"02e4b1ff-3345-4104-b333-cba2f5cd9388\") " pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.176627 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-xtddq" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.192475 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8w8j\" (UniqueName: \"kubernetes.io/projected/ee1e0a33-feb5-4a3b-8d62-dca835529d5e-kube-api-access-n8w8j\") pod \"mariadb-operator-controller-manager-64d7c556cd-5nstv\" (UID: \"ee1e0a33-feb5-4a3b-8d62-dca835529d5e\") " pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.202640 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr98j\" (UniqueName: \"kubernetes.io/projected/d28e781c-96cf-4377-8cbc-f32b112e3dc7-kube-api-access-jr98j\") pod \"manila-operator-controller-manager-5cbc8c7f96-lpl84\" (UID: \"d28e781c-96cf-4377-8cbc-f32b112e3dc7\") " pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.211078 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hkj\" (UniqueName: \"kubernetes.io/projected/9890ab17-b307-4506-9420-0a50e671792e-kube-api-access-v4hkj\") pod \"octavia-operator-controller-manager-d5fb87cb8-tg7br\" (UID: \"9890ab17-b307-4506-9420-0a50e671792e\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.211145 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8bsm\" (UniqueName: \"kubernetes.io/projected/f5882b54-a120-4eff-88e8-bf0a5d7758ff-kube-api-access-s8bsm\") pod \"neutron-operator-controller-manager-58879495c-g2b6v\" (UID: \"f5882b54-a120-4eff-88e8-bf0a5d7758ff\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.211240 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhm9j\" (UniqueName: \"kubernetes.io/projected/7d5162d9-add8-44b3-8301-82cbd7d09878-kube-api-access-fhm9j\") pod \"nova-operator-controller-manager-79d658b66d-nxgnw\" (UID: \"7d5162d9-add8-44b3-8301-82cbd7d09878\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.222217 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.250356 4676 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.252664 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.255877 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj9kr\" (UniqueName: \"kubernetes.io/projected/62a08aac-45ea-4944-9d7f-9d78114d07a0-kube-api-access-dj9kr\") pod \"keystone-operator-controller-manager-79cc9d59f5-tqc5z\" (UID: \"62a08aac-45ea-4944-9d7f-9d78114d07a0\") " pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.260631 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsz9c\" (UniqueName: \"kubernetes.io/projected/171288d7-22db-4357-bbfc-0f5ffa6b709c-kube-api-access-jsz9c\") pod \"ironic-operator-controller-manager-54485f899-zgqv7\" (UID: \"171288d7-22db-4357-bbfc-0f5ffa6b709c\") " pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.273481 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kqbcl" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.276178 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.304024 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.322967 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8bsm\" (UniqueName: \"kubernetes.io/projected/f5882b54-a120-4eff-88e8-bf0a5d7758ff-kube-api-access-s8bsm\") pod \"neutron-operator-controller-manager-58879495c-g2b6v\" (UID: \"f5882b54-a120-4eff-88e8-bf0a5d7758ff\") " pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.323840 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-t4p48"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.325444 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.325752 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hkj\" (UniqueName: \"kubernetes.io/projected/9890ab17-b307-4506-9420-0a50e671792e-kube-api-access-v4hkj\") pod \"octavia-operator-controller-manager-d5fb87cb8-tg7br\" (UID: \"9890ab17-b307-4506-9420-0a50e671792e\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.325852 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2vpf\" (UniqueName: \"kubernetes.io/projected/d373173f-fba9-4fc1-9d7d-5424dca0303e-kube-api-access-g2vpf\") pod \"ovn-operator-controller-manager-5b67cfc8fb-7g426\" (UID: \"d373173f-fba9-4fc1-9d7d-5424dca0303e\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.325950 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhm9j\" (UniqueName: \"kubernetes.io/projected/7d5162d9-add8-44b3-8301-82cbd7d09878-kube-api-access-fhm9j\") pod \"nova-operator-controller-manager-79d658b66d-nxgnw\" (UID: \"7d5162d9-add8-44b3-8301-82cbd7d09878\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.347964 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-t4p48"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.445401 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.448156 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2vpf\" (UniqueName: \"kubernetes.io/projected/d373173f-fba9-4fc1-9d7d-5424dca0303e-kube-api-access-g2vpf\") pod \"ovn-operator-controller-manager-5b67cfc8fb-7g426\" (UID: \"d373173f-fba9-4fc1-9d7d-5424dca0303e\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.460320 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.484551 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-h8w5p" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.498825 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.500147 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.501330 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.502442 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.502809 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.503896 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-84m9m" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.504148 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.538655 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhm9j\" (UniqueName: \"kubernetes.io/projected/7d5162d9-add8-44b3-8301-82cbd7d09878-kube-api-access-fhm9j\") pod \"nova-operator-controller-manager-79d658b66d-nxgnw\" (UID: \"7d5162d9-add8-44b3-8301-82cbd7d09878\") " pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.553422 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.553950 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.554758 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbsm4\" (UniqueName: \"kubernetes.io/projected/53683a17-2c47-4b4c-b145-74620d4d7a16-kube-api-access-pbsm4\") pod \"placement-operator-controller-manager-867d87977b-t4p48\" (UID: \"53683a17-2c47-4b4c-b145-74620d4d7a16\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.562241 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.567113 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hkj\" (UniqueName: \"kubernetes.io/projected/9890ab17-b307-4506-9420-0a50e671792e-kube-api-access-v4hkj\") pod \"octavia-operator-controller-manager-d5fb87cb8-tg7br\" (UID: \"9890ab17-b307-4506-9420-0a50e671792e\") " pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.567536 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2vpf\" (UniqueName: \"kubernetes.io/projected/d373173f-fba9-4fc1-9d7d-5424dca0303e-kube-api-access-g2vpf\") pod \"ovn-operator-controller-manager-5b67cfc8fb-7g426\" (UID: \"d373173f-fba9-4fc1-9d7d-5424dca0303e\") " pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.583148 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.583270 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.597034 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.611928 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-sb669" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.622801 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.640833 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.641390 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.656457 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a67582d-5c84-40fc-977b-4c0d42d9864b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.656521 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6fv9\" (UniqueName: \"kubernetes.io/projected/8a67582d-5c84-40fc-977b-4c0d42d9864b-kube-api-access-m6fv9\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.656588 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbsm4\" (UniqueName: \"kubernetes.io/projected/53683a17-2c47-4b4c-b145-74620d4d7a16-kube-api-access-pbsm4\") pod \"placement-operator-controller-manager-867d87977b-t4p48\" (UID: \"53683a17-2c47-4b4c-b145-74620d4d7a16\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.656670 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2wb\" (UniqueName: \"kubernetes.io/projected/255159ec-7751-4663-a0b9-0e97f9c0824d-kube-api-access-8c2wb\") pod \"swift-operator-controller-manager-8f6687c44-24pgj\" (UID: \"255159ec-7751-4663-a0b9-0e97f9c0824d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.657805 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2cfbb" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.673425 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.697450 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbsm4\" (UniqueName: \"kubernetes.io/projected/53683a17-2c47-4b4c-b145-74620d4d7a16-kube-api-access-pbsm4\") pod \"placement-operator-controller-manager-867d87977b-t4p48\" (UID: \"53683a17-2c47-4b4c-b145-74620d4d7a16\") " pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.697481 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.738808 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.740453 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.745298 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sqtjn" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.758247 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a67582d-5c84-40fc-977b-4c0d42d9864b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.758411 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6fv9\" (UniqueName: \"kubernetes.io/projected/8a67582d-5c84-40fc-977b-4c0d42d9864b-kube-api-access-m6fv9\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.758539 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vg45\" (UniqueName: \"kubernetes.io/projected/66135fe6-10ac-4049-b7a7-e40aa82f78e7-kube-api-access-5vg45\") pod \"telemetry-operator-controller-manager-695797c565-mtxgd\" (UID: \"66135fe6-10ac-4049-b7a7-e40aa82f78e7\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.758590 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2wb\" (UniqueName: \"kubernetes.io/projected/255159ec-7751-4663-a0b9-0e97f9c0824d-kube-api-access-8c2wb\") pod \"swift-operator-controller-manager-8f6687c44-24pgj\" (UID: \"255159ec-7751-4663-a0b9-0e97f9c0824d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:35:52 crc kubenswrapper[4676]: E1204 15:35:52.758809 4676 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 04 15:35:52 crc kubenswrapper[4676]: E1204 15:35:52.759069 4676 
Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.788013 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.906205 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.915317 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.923605 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fbzpd" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.928341 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vg45\" (UniqueName: \"kubernetes.io/projected/66135fe6-10ac-4049-b7a7-e40aa82f78e7-kube-api-access-5vg45\") pod \"telemetry-operator-controller-manager-695797c565-mtxgd\" (UID: \"66135fe6-10ac-4049-b7a7-e40aa82f78e7\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.928517 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v82w\" (UniqueName: \"kubernetes.io/projected/a2059da3-6c0d-4623-8406-5f25aed58fbf-kube-api-access-6v82w\") pod \"test-operator-controller-manager-bb86466d8-x7nbg\" (UID: \"a2059da3-6c0d-4623-8406-5f25aed58fbf\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.951246 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6fv9\" (UniqueName: \"kubernetes.io/projected/8a67582d-5c84-40fc-977b-4c0d42d9864b-kube-api-access-m6fv9\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.954452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2wb\" (UniqueName: \"kubernetes.io/projected/255159ec-7751-4663-a0b9-0e97f9c0824d-kube-api-access-8c2wb\") pod \"swift-operator-controller-manager-8f6687c44-24pgj\" (UID: \"255159ec-7751-4663-a0b9-0e97f9c0824d\") " pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.955514 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5w7r8" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" 
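containerName="registry-server" containerID="cri-o://ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf" gracePeriod=2

Note: interleaved with the operator rollout, the kubelet is also tearing down a marketplace catalog pod and asks CRI-O to stop its registry-server container with a 2-second grace period: SIGTERM first, SIGKILL only if the process outlives the window. The ContainerDied event at 15:35:54.014 below reports exitCode=0, i.e. a clean exit within the grace period. A generic Unix Go sketch of the pattern (not CRI-O's actual implementation):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // stopWithGrace: SIGTERM first, SIGKILL only if the process survives
    // the grace period.
    func stopWithGrace(pid int, grace time.Duration) error {
        if err := syscall.Kill(pid, syscall.SIGTERM); err != nil {
            return err
        }
        deadline := time.Now().Add(grace)
        for time.Now().Before(deadline) {
            // Signal 0 probes for existence without delivering a signal.
            if err := syscall.Kill(pid, syscall.Signal(0)); err != nil {
                return nil // process exited within the grace window
            }
            time.Sleep(100 * time.Millisecond)
        }
        fmt.Println("grace period expired; escalating to SIGKILL")
        return syscall.Kill(pid, syscall.SIGKILL)
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }() // reap promptly so the probe sees the exit
        _ = stopWithGrace(cmd.Process.Pid, 2*time.Second) // gracePeriod=2, as in the log
        <-done
    }
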
containerName="registry-server" containerID="cri-o://ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf" gracePeriod=2 Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.956299 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg"] Dec 04 15:35:52 crc kubenswrapper[4676]: I1204 15:35:52.970340 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.012560 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vg45\" (UniqueName: \"kubernetes.io/projected/66135fe6-10ac-4049-b7a7-e40aa82f78e7-kube-api-access-5vg45\") pod \"telemetry-operator-controller-manager-695797c565-mtxgd\" (UID: \"66135fe6-10ac-4049-b7a7-e40aa82f78e7\") " pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.018026 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.029711 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v82w\" (UniqueName: \"kubernetes.io/projected/a2059da3-6c0d-4623-8406-5f25aed58fbf-kube-api-access-6v82w\") pod \"test-operator-controller-manager-bb86466d8-x7nbg\" (UID: \"a2059da3-6c0d-4623-8406-5f25aed58fbf\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.030147 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnf7q\" (UniqueName: \"kubernetes.io/projected/93e0c78f-854f-4c11-b457-f5e1b429a7bc-kube-api-access-qnf7q\") pod \"watcher-operator-controller-manager-6c44f899f9-n7xc5\" (UID: \"93e0c78f-854f-4c11-b457-f5e1b429a7bc\") " pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.050352 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.052032 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.054196 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ngzmq" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.054460 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.080520 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v82w\" (UniqueName: \"kubernetes.io/projected/a2059da3-6c0d-4623-8406-5f25aed58fbf-kube-api-access-6v82w\") pod \"test-operator-controller-manager-bb86466d8-x7nbg\" (UID: \"a2059da3-6c0d-4623-8406-5f25aed58fbf\") " pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.090979 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.116364 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.117826 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.119855 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-txqbf" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.127461 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.131416 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnf7q\" (UniqueName: \"kubernetes.io/projected/93e0c78f-854f-4c11-b457-f5e1b429a7bc-kube-api-access-qnf7q\") pod \"watcher-operator-controller-manager-6c44f899f9-n7xc5\" (UID: \"93e0c78f-854f-4c11-b457-f5e1b429a7bc\") " pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.131483 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftj5\" (UniqueName: \"kubernetes.io/projected/468399f0-8b75-47d3-9576-fc4f572fc422-kube-api-access-mftj5\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.131567 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blt9l\" (UniqueName: \"kubernetes.io/projected/3b483864-ee9a-49b1-b75f-5f9b23e9534d-kube-api-access-blt9l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ngw55\" (UID: \"3b483864-ee9a-49b1-b75f-5f9b23e9534d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.131596 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/468399f0-8b75-47d3-9576-fc4f572fc422-cert\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.162874 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnf7q\" (UniqueName: \"kubernetes.io/projected/93e0c78f-854f-4c11-b457-f5e1b429a7bc-kube-api-access-qnf7q\") pod \"watcher-operator-controller-manager-6c44f899f9-n7xc5\" (UID: \"93e0c78f-854f-4c11-b457-f5e1b429a7bc\") " pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.236820 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blt9l\" (UniqueName: \"kubernetes.io/projected/3b483864-ee9a-49b1-b75f-5f9b23e9534d-kube-api-access-blt9l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ngw55\" (UID: \"3b483864-ee9a-49b1-b75f-5f9b23e9534d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.236896 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/468399f0-8b75-47d3-9576-fc4f572fc422-cert\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.237033 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mftj5\" (UniqueName: \"kubernetes.io/projected/468399f0-8b75-47d3-9576-fc4f572fc422-kube-api-access-mftj5\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.244615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/468399f0-8b75-47d3-9576-fc4f572fc422-cert\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.400194 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a67582d-5c84-40fc-977b-4c0d42d9864b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.409199 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-2cfbb" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.414870 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a67582d-5c84-40fc-977b-4c0d42d9864b-cert\") pod \"openstack-baremetal-operator-controller-manager-77868f484-f29bx\" (UID: \"8a67582d-5c84-40fc-977b-4c0d42d9864b\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.415023 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.427518 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftj5\" (UniqueName: \"kubernetes.io/projected/468399f0-8b75-47d3-9576-fc4f572fc422-kube-api-access-mftj5\") pod \"openstack-operator-controller-manager-6b8756448-bqf62\" (UID: \"468399f0-8b75-47d3-9576-fc4f572fc422\") " pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.436933 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blt9l\" (UniqueName: \"kubernetes.io/projected/3b483864-ee9a-49b1-b75f-5f9b23e9534d-kube-api-access-blt9l\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ngw55\" (UID: \"3b483864-ee9a-49b1-b75f-5f9b23e9534d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.486122 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-84m9m" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.494897 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.550236 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-sqtjn" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.551973 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd"] Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.558404 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:35:53 crc kubenswrapper[4676]: W1204 15:35:53.570182 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbba238e_b271_48f0_9356_c1ba4b7446f8.slice/crio-8bee94490742988e3418a99d621e10d5ba856d71f2c389c2712fc726c6e7b900 WatchSource:0}: Error finding container 8bee94490742988e3418a99d621e10d5ba856d71f2c389c2712fc726c6e7b900: Status 404 returned error can't find the container with id 8bee94490742988e3418a99d621e10d5ba856d71f2c389c2712fc726c6e7b900 Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.573331 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-fbzpd" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.581631 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.598707 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ngzmq" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.606676 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.616973 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-txqbf" Dec 04 15:35:53 crc kubenswrapper[4676]: I1204 15:35:53.626614 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.014468 4676 generic.go:334] "Generic (PLEG): container finished" podID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerID="ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf" exitCode=0 Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.014774 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerDied","Data":"ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf"} Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.018410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" event={"ID":"dbba238e-b271-48f0-9356-c1ba4b7446f8","Type":"ContainerStarted","Data":"8bee94490742988e3418a99d621e10d5ba856d71f2c389c2712fc726c6e7b900"} Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.337082 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.475700 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.487667 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities\") pod \"395ffe4b-ade5-4326-8a64-03892c41efd7\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.487732 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content\") pod \"395ffe4b-ade5-4326-8a64-03892c41efd7\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.487854 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk47c\" (UniqueName: \"kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c\") pod \"395ffe4b-ade5-4326-8a64-03892c41efd7\" (UID: \"395ffe4b-ade5-4326-8a64-03892c41efd7\") " Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.491185 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities" (OuterVolumeSpecName: "utilities") pod "395ffe4b-ade5-4326-8a64-03892c41efd7" (UID: "395ffe4b-ade5-4326-8a64-03892c41efd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.494836 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.510356 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.517256 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c" (OuterVolumeSpecName: "kube-api-access-qk47c") pod "395ffe4b-ade5-4326-8a64-03892c41efd7" (UID: "395ffe4b-ade5-4326-8a64-03892c41efd7"). InnerVolumeSpecName "kube-api-access-qk47c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.611678 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk47c\" (UniqueName: \"kubernetes.io/projected/395ffe4b-ade5-4326-8a64-03892c41efd7-kube-api-access-qk47c\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.611723 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.690278 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "395ffe4b-ade5-4326-8a64-03892c41efd7" (UID: "395ffe4b-ade5-4326-8a64-03892c41efd7"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.713704 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/395ffe4b-ade5-4326-8a64-03892c41efd7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.914822 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.931925 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.940655 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.953586 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.963854 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.971013 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.978445 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.985552 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv"] Dec 04 15:35:54 crc kubenswrapper[4676]: I1204 15:35:54.996681 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb"] Dec 04 15:35:54 crc kubenswrapper[4676]: W1204 15:35:54.996717 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd373173f_fba9_4fc1_9d7d_5424dca0303e.slice/crio-6541ac91347f09b38dad76d16661f8b8ff45e1ebe9a4e3f630701a66415839e2 WatchSource:0}: Error finding container 6541ac91347f09b38dad76d16661f8b8ff45e1ebe9a4e3f630701a66415839e2: Status 404 returned error can't find the container with id 6541ac91347f09b38dad76d16661f8b8ff45e1ebe9a4e3f630701a66415839e2 Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.007565 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7"] Dec 04 15:35:55 crc kubenswrapper[4676]: W1204 15:35:55.011110 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee1e0a33_feb5_4a3b_8d62_dca835529d5e.slice/crio-08bde93ad278e1f15a29a8fa662bf58b31114d87f824326e6caf41c279f5a77e WatchSource:0}: Error finding container 08bde93ad278e1f15a29a8fa662bf58b31114d87f824326e6caf41c279f5a77e: Status 404 returned error can't find the container with id 08bde93ad278e1f15a29a8fa662bf58b31114d87f824326e6caf41c279f5a77e Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.023447 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8w8j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-64d7c556cd-5nstv_openstack-operators(ee1e0a33-feb5-4a3b-8d62-dca835529d5e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.033694 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" event={"ID":"02e4b1ff-3345-4104-b333-cba2f5cd9388","Type":"ContainerStarted","Data":"b9a500914b4ce16bf6e092612d4374027eede136bdd52674c8703425a9f60e97"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.042441 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" event={"ID":"ee1e0a33-feb5-4a3b-8d62-dca835529d5e","Type":"ContainerStarted","Data":"08bde93ad278e1f15a29a8fa662bf58b31114d87f824326e6caf41c279f5a77e"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.043832 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" event={"ID":"7d5162d9-add8-44b3-8301-82cbd7d09878","Type":"ContainerStarted","Data":"ed04629cf93eaffb622869d43096632b8f0dab74bd257ecf829a55b45a4f0a58"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.202362 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" event={"ID":"25a6adcc-b6f7-41ee-a0d3-9594455bedda","Type":"ContainerStarted","Data":"fbe7a7e11bcd383416480fa926bb653e3b93925d03ab691e64439381c0ece89a"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.204821 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" event={"ID":"db83cc98-e9f7-4c8a-989a-ad3150de91b9","Type":"ContainerStarted","Data":"4d05920917300df0cce1e208664322c7bfec2003c401afb791f5487d6834143f"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.208379 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" event={"ID":"62a08aac-45ea-4944-9d7f-9d78114d07a0","Type":"ContainerStarted","Data":"b12f9b754b4736c73819c5165b275a7f3dd0ae0425fb653487f7935b1af5eab2"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.211195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" event={"ID":"c7bf3f72-274b-4db9-8822-25999acad8b6","Type":"ContainerStarted","Data":"a90124a923c8e69f4f8c462098dafd01b46bd14b7a7d47712d3c03eeea2b11ae"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.213412 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" event={"ID":"f5882b54-a120-4eff-88e8-bf0a5d7758ff","Type":"ContainerStarted","Data":"4a4193297f8ac1a918ebf4514339dd3349c2f41ca922c15e637d34d95e022ed2"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.218101 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" event={"ID":"1b01dbe4-9e3e-403e-938a-22f130b47202","Type":"ContainerStarted","Data":"884ef1fe608e3784d855ed327e92dacde1eaa564119723af747ae433a265b3dd"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.247412 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" event={"ID":"171288d7-22db-4357-bbfc-0f5ffa6b709c","Type":"ContainerStarted","Data":"0b1764818f9b65ee12b00caaa13c1eed2c502db8f0529c83bbe4f5d2c0f5c48c"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.249206 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" event={"ID":"191599a4-dee2-4d6c-b7ba-09e4f60faaf5","Type":"ContainerStarted","Data":"92aaaa69e7408be734a5b814c44d272e46e68e12addf8f51eb1cbdafc43325c5"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.256584 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" event={"ID":"d28e781c-96cf-4377-8cbc-f32b112e3dc7","Type":"ContainerStarted","Data":"8ca97dd07e9d84009e7fd849dbbfe3b9fc068576f8d0886d7d7d2f3029b5f3f0"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.275869 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" event={"ID":"d373173f-fba9-4fc1-9d7d-5424dca0303e","Type":"ContainerStarted","Data":"6541ac91347f09b38dad76d16661f8b8ff45e1ebe9a4e3f630701a66415839e2"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.285392 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5"] Dec 04 15:35:55 crc kubenswrapper[4676]: W1204 15:35:55.285863 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod468399f0_8b75_47d3_9576_fc4f572fc422.slice/crio-6b5a4df1e80ff278c6919268c87f00c79ff66d09b7a1989ca96ca709d7ceaf83 WatchSource:0}: Error finding container 6b5a4df1e80ff278c6919268c87f00c79ff66d09b7a1989ca96ca709d7ceaf83: Status 404 returned error can't find the container with id 6b5a4df1e80ff278c6919268c87f00c79ff66d09b7a1989ca96ca709d7ceaf83 Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.293524 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd"] Dec 04 15:35:55 crc kubenswrapper[4676]: W1204 15:35:55.298548 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod255159ec_7751_4663_a0b9_0e97f9c0824d.slice/crio-8c8773e5a76edfef223d27488eebe5fca9e151661f3b153dba956e0eb19d4bb2 WatchSource:0}: Error finding container 8c8773e5a76edfef223d27488eebe5fca9e151661f3b153dba956e0eb19d4bb2: Status 404 returned error can't find the container with id 8c8773e5a76edfef223d27488eebe5fca9e151661f3b153dba956e0eb19d4bb2 Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.300448 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.301382 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5w7r8" event={"ID":"395ffe4b-ade5-4326-8a64-03892c41efd7","Type":"ContainerDied","Data":"f06ebcca7f2da07a58dc3ca32dc55b2d914fcf679fecb3d38c5f62586c78d33d"} Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.301460 4676 scope.go:117] "RemoveContainer" containerID="ad5cfbd980478aebe9461e59887a8a786b41bd492e87e9c2f3011a9dbd6c80bf" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.301690 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5w7r8" Dec 04 15:35:55 crc kubenswrapper[4676]: W1204 15:35:55.308124 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2059da3_6c0d_4623_8406_5f25aed58fbf.slice/crio-de06e9b397276d4b51c2cb04c5e17a092a4cbd62f666f6067066886daf974e7d WatchSource:0}: Error finding container de06e9b397276d4b51c2cb04c5e17a092a4cbd62f666f6067066886daf974e7d: Status 404 returned error can't find the container with id de06e9b397276d4b51c2cb04c5e17a092a4cbd62f666f6067066886daf974e7d Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.309376 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55"] Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.315339 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbsm4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-867d87977b-t4p48_openstack-operators(53683a17-2c47-4b4c-b145-74620d4d7a16): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: W1204 15:35:55.317375 4676 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93e0c78f_854f_4c11_b457_f5e1b429a7bc.slice/crio-98308017f49ae5f3132494bb556e14707c120e7b7d8829481ec0ded6925f1189 WatchSource:0}: Error finding container 98308017f49ae5f3132494bb556e14707c120e7b7d8829481ec0ded6925f1189: Status 404 returned error can't find the container with id 98308017f49ae5f3132494bb556e14707c120e7b7d8829481ec0ded6925f1189 Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.318444 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj"] Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.320547 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.129.56.200:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qnf7q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c44f899f9-n7xc5_openstack-operators(93e0c78f-854f-4c11-b457-f5e1b429a7bc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.331728 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.348311 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.350540 4676 
scope.go:117] "RemoveContainer" containerID="e8692287b318a49a22e70347c3b1f2280f5309b031733dfb757d4eb47d1b873b" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.351528 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-blt9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-ngw55_openstack-operators(3b483864-ee9a-49b1-b75f-5f9b23e9534d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.351541 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:
quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstac
k-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:qu
ay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m6fv9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-77868f484-f29bx_openstack-operators(8a67582d-5c84-40fc-977b-4c0d42d9864b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.351865 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:78d91c3cdd5eda41c2cd6d4a8491844e161dc33f6221be8cb822b2107d7ff46f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5vg45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-695797c565-mtxgd_openstack-operators(66135fe6-10ac-4049-b7a7-e40aa82f78e7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.352642 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" podUID="3b483864-ee9a-49b1-b75f-5f9b23e9534d" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.351875 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4hkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-d5fb87cb8-tg7br_openstack-operators(9890ab17-b307-4506-9420-0a50e671792e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.355236 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.360062 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-867d87977b-t4p48"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.366736 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.374750 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5w7r8"] Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.388072 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" podUID="ee1e0a33-feb5-4a3b-8d62-dca835529d5e" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.392800 4676 scope.go:117] "RemoveContainer" containerID="f3112355643b17cd89d19da49e71e8d385203467f314b456f18499248a11fdd6" Dec 04 15:35:55 crc kubenswrapper[4676]: I1204 15:35:55.402072 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" path="/var/lib/kubelet/pods/395ffe4b-ade5-4326-8a64-03892c41efd7/volumes" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.597737 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" podUID="53683a17-2c47-4b4c-b145-74620d4d7a16" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.865979 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" podUID="8a67582d-5c84-40fc-977b-4c0d42d9864b" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.906398 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" podUID="66135fe6-10ac-4049-b7a7-e40aa82f78e7" Dec 04 15:35:55 crc 
kubenswrapper[4676]: E1204 15:35:55.922949 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" podUID="93e0c78f-854f-4c11-b457-f5e1b429a7bc" Dec 04 15:35:55 crc kubenswrapper[4676]: E1204 15:35:55.970535 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" podUID="9890ab17-b307-4506-9420-0a50e671792e" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.353378 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" event={"ID":"468399f0-8b75-47d3-9576-fc4f572fc422","Type":"ContainerStarted","Data":"df515a3b40eb983b935f60499e995591d79678e790d08b91bb77d8479bb54d86"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.353430 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" event={"ID":"468399f0-8b75-47d3-9576-fc4f572fc422","Type":"ContainerStarted","Data":"af385ebf2f479f8be0b5da09f71feb1c1e3e6e53791363cd196bfc50f4e0b9fe"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.353449 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" event={"ID":"468399f0-8b75-47d3-9576-fc4f572fc422","Type":"ContainerStarted","Data":"6b5a4df1e80ff278c6919268c87f00c79ff66d09b7a1989ca96ca709d7ceaf83"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.353695 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.355961 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" event={"ID":"ee1e0a33-feb5-4a3b-8d62-dca835529d5e","Type":"ContainerStarted","Data":"a3e3a5c0d95a1a4266ac8e790557e1c0a1debfc4217040a502c2f1f3b2c54224"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.357781 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" podUID="ee1e0a33-feb5-4a3b-8d62-dca835529d5e" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.360381 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" event={"ID":"9890ab17-b307-4506-9420-0a50e671792e","Type":"ContainerStarted","Data":"5d23e87a323bdcb114e88ada1b2187c4f793e5217492a80c30580dd3c365e0a6"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.360418 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" event={"ID":"9890ab17-b307-4506-9420-0a50e671792e","Type":"ContainerStarted","Data":"813b2d7d6952332f9f458d46aa86eb9c7663fdfd3bfb83175ddd73a9f7ad3568"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.362305 4676 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" podUID="9890ab17-b307-4506-9420-0a50e671792e" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.371147 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" event={"ID":"3b483864-ee9a-49b1-b75f-5f9b23e9534d","Type":"ContainerStarted","Data":"268e489f1a7fc45ad6d0ea386d4a0c232a02bade1fd23e5edd9eb9d0f6aac1c0"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.373036 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" podUID="3b483864-ee9a-49b1-b75f-5f9b23e9534d" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.373820 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" event={"ID":"a2059da3-6c0d-4623-8406-5f25aed58fbf","Type":"ContainerStarted","Data":"de06e9b397276d4b51c2cb04c5e17a092a4cbd62f666f6067066886daf974e7d"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.398285 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" event={"ID":"8a67582d-5c84-40fc-977b-4c0d42d9864b","Type":"ContainerStarted","Data":"b7aec66bd5ceff3709d4ba6fcff9ac6dc0bd5bdd8ddc8f0627cfc149815b7bcb"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.398340 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" event={"ID":"8a67582d-5c84-40fc-977b-4c0d42d9864b","Type":"ContainerStarted","Data":"ae32d9417b9237cf428c78ae71ca2acabc7c58f886e5b466c79dcca60e601946"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.428559 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" podUID="8a67582d-5c84-40fc-977b-4c0d42d9864b" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.432325 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" podStartSLOduration=4.432286895 podStartE2EDuration="4.432286895s" podCreationTimestamp="2025-12-04 15:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:35:56.429179886 +0000 UTC m=+963.863849763" watchObservedRunningTime="2025-12-04 15:35:56.432286895 +0000 UTC m=+963.866956752" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.433479 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" 
event={"ID":"66135fe6-10ac-4049-b7a7-e40aa82f78e7","Type":"ContainerStarted","Data":"b599e295f5b099bff4096eb9a1c371457fac90f52822bf523dd7af9e40eeadbb"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.433529 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" event={"ID":"66135fe6-10ac-4049-b7a7-e40aa82f78e7","Type":"ContainerStarted","Data":"937b112828afa18d5dcf906a8005cd88d46dc47d9d256e63ec606f30882df9db"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.438571 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:78d91c3cdd5eda41c2cd6d4a8491844e161dc33f6221be8cb822b2107d7ff46f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" podUID="66135fe6-10ac-4049-b7a7-e40aa82f78e7" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.442874 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" event={"ID":"93e0c78f-854f-4c11-b457-f5e1b429a7bc","Type":"ContainerStarted","Data":"879bbbc586f6243c3479f84a6bbf9e9cb14a0fdf028e42cb15612887edbbbf22"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.444034 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" event={"ID":"93e0c78f-854f-4c11-b457-f5e1b429a7bc","Type":"ContainerStarted","Data":"98308017f49ae5f3132494bb556e14707c120e7b7d8829481ec0ded6925f1189"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.456226 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" podUID="93e0c78f-854f-4c11-b457-f5e1b429a7bc" Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.473090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" event={"ID":"255159ec-7751-4663-a0b9-0e97f9c0824d","Type":"ContainerStarted","Data":"8c8773e5a76edfef223d27488eebe5fca9e151661f3b153dba956e0eb19d4bb2"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.492080 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" event={"ID":"53683a17-2c47-4b4c-b145-74620d4d7a16","Type":"ContainerStarted","Data":"41b46e30b85f7724730efa9b5335d2d0361683eba647f2b8a01d5eb83bb946ee"} Dec 04 15:35:56 crc kubenswrapper[4676]: I1204 15:35:56.492137 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" event={"ID":"53683a17-2c47-4b4c-b145-74620d4d7a16","Type":"ContainerStarted","Data":"e7ff206f254bf174f045985eb910e204fd00a445a9eea6e6df358ba7f2853ad9"} Dec 04 15:35:56 crc kubenswrapper[4676]: E1204 15:35:56.494145 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d\\\"\"" 
pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" podUID="53683a17-2c47-4b4c-b145-74620d4d7a16" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.529867 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/openstack-k8s-operators/watcher-operator:5e4745ad47403efffe48968672fd7cd5232c27d7\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" podUID="93e0c78f-854f-4c11-b457-f5e1b429a7bc" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.531247 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:78d91c3cdd5eda41c2cd6d4a8491844e161dc33f6221be8cb822b2107d7ff46f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" podUID="66135fe6-10ac-4049-b7a7-e40aa82f78e7" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.535305 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" podUID="3b483864-ee9a-49b1-b75f-5f9b23e9534d" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.535295 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:fd917de0cf800ec284ee0c3f2906a06d85ea18cb75a5b06c8eb305750467986d\\\"\"" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" podUID="53683a17-2c47-4b4c-b145-74620d4d7a16" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.535408 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:66928f0eae5206f671ac7b21f79953e37009c54187d768dc6e03fe0a3d202b3b\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" podUID="8a67582d-5c84-40fc-977b-4c0d42d9864b" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.535448 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2c4fe20e044dd8ea1f60f2f3f5e3844d932b4b79439835bd8771c73f16b38312\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" podUID="ee1e0a33-feb5-4a3b-8d62-dca835529d5e" Dec 04 15:35:57 crc kubenswrapper[4676]: E1204 15:35:57.535477 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:5245e851b4476baecd4173eca3e8669ac09ec69d36ad1ebc3a0f867713cbc14b\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" podUID="9890ab17-b307-4506-9420-0a50e671792e" Dec 04 15:36:03 crc kubenswrapper[4676]: I1204 15:36:03.612171 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6b8756448-bqf62" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.768499 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:10 crc kubenswrapper[4676]: E1204 15:36:10.770539 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="extract-content" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.770685 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="extract-content" Dec 04 15:36:10 crc kubenswrapper[4676]: E1204 15:36:10.770808 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="extract-utilities" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.770885 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="extract-utilities" Dec 04 15:36:10 crc kubenswrapper[4676]: E1204 15:36:10.770968 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="registry-server" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.771031 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="registry-server" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.771309 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="395ffe4b-ade5-4326-8a64-03892c41efd7" containerName="registry-server" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.773200 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:10 crc kubenswrapper[4676]: I1204 15:36:10.783242 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.000087 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mql5z\" (UniqueName: \"kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.000143 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.000282 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.101857 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.102003 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mql5z\" (UniqueName: \"kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.102305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.102551 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.102678 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.125043 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mql5z\" (UniqueName: \"kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z\") pod \"redhat-marketplace-gld6t\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:11 crc kubenswrapper[4676]: I1204 15:36:11.200428 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:17 crc kubenswrapper[4676]: E1204 15:36:17.335465 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:4f799c74da2f1c864af24fcd5efd91ec64848972a95246eac6b5c6c4d71c1756" Dec 04 15:36:17 crc kubenswrapper[4676]: E1204 15:36:17.336416 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:4f799c74da2f1c864af24fcd5efd91ec64848972a95246eac6b5c6c4d71c1756,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dj9kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-79cc9d59f5-tqc5z_openstack-operators(62a08aac-45ea-4944-9d7f-9d78114d07a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:36:17 crc kubenswrapper[4676]: E1204 15:36:17.839502 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:49180c7bd4f0071e43ae7044260a3a97c4aa34fcbcb2d0d4573df449765ed391" Dec 04 15:36:17 crc kubenswrapper[4676]: E1204 15:36:17.840031 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:49180c7bd4f0071e43ae7044260a3a97c4aa34fcbcb2d0d4573df449765ed391,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6v82w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-bb86466d8-x7nbg_openstack-operators(a2059da3-6c0d-4623-8406-5f25aed58fbf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:36:19 crc kubenswrapper[4676]: E1204 15:36:19.086014 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8" Dec 04 15:36:19 crc kubenswrapper[4676]: E1204 15:36:19.086650 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jr98j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5cbc8c7f96-lpl84_openstack-operators(d28e781c-96cf-4377-8cbc-f32b112e3dc7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:36:19 crc kubenswrapper[4676]: E1204 15:36:19.620729 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31" Dec 04 15:36:19 crc kubenswrapper[4676]: E1204 15:36:19.621507 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kn8xh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-85fbd69fcd-7vsrd_openstack-operators(25a6adcc-b6f7-41ee-a0d3-9594455bedda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:36:26 crc kubenswrapper[4676]: I1204 15:36:26.142519 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:26 crc kubenswrapper[4676]: W1204 15:36:26.489002 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1089f57_2ca2_48db_b638_33223ceae381.slice/crio-678d0f2693e197fdf6a0d8927bbf619a59128f2a1d106700ab16e285d2b9de27 WatchSource:0}: Error finding container 678d0f2693e197fdf6a0d8927bbf619a59128f2a1d106700ab16e285d2b9de27: Status 404 returned error can't find the container with id 678d0f2693e197fdf6a0d8927bbf619a59128f2a1d106700ab16e285d2b9de27 Dec 04 15:36:27 crc kubenswrapper[4676]: I1204 15:36:27.180037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerStarted","Data":"678d0f2693e197fdf6a0d8927bbf619a59128f2a1d106700ab16e285d2b9de27"} Dec 04 15:36:27 crc kubenswrapper[4676]: E1204 15:36:27.259530 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" podUID="d28e781c-96cf-4377-8cbc-f32b112e3dc7" Dec 04 15:36:27 crc kubenswrapper[4676]: E1204 15:36:27.443525 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" podUID="25a6adcc-b6f7-41ee-a0d3-9594455bedda" Dec 04 15:36:27 crc kubenswrapper[4676]: E1204 15:36:27.635631 4676 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" podUID="a2059da3-6c0d-4623-8406-5f25aed58fbf" Dec 04 15:36:27 crc kubenswrapper[4676]: E1204 15:36:27.682296 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" podUID="62a08aac-45ea-4944-9d7f-9d78114d07a0" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.200951 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" event={"ID":"171288d7-22db-4357-bbfc-0f5ffa6b709c","Type":"ContainerStarted","Data":"cb003af39b73564d34d65978a77130e9b29f4e9c1cc62f070e142ca4876953d1"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.209698 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" event={"ID":"ee1e0a33-feb5-4a3b-8d62-dca835529d5e","Type":"ContainerStarted","Data":"ddf46e1691d98fcede70e66a141ecfb5be1453947c07c85fd1b92dbff529ddf8"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.210005 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.219395 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" event={"ID":"f5882b54-a120-4eff-88e8-bf0a5d7758ff","Type":"ContainerStarted","Data":"51f0a4236792d20fa540e9f45e62aab01a39709d3f9fa297eefcff259ed98b74"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.224162 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" event={"ID":"02e4b1ff-3345-4104-b333-cba2f5cd9388","Type":"ContainerStarted","Data":"c86404819b60363bf4a063764e6daca641b77e6159334c3abfc23ed4d0aa6284"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.241635 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" event={"ID":"9890ab17-b307-4506-9420-0a50e671792e","Type":"ContainerStarted","Data":"7a1c82096aaeacf320e8a0107879a5141ef4f5b95019010b6a533cd3841fdf30"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.242463 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.253326 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" event={"ID":"a2059da3-6c0d-4623-8406-5f25aed58fbf","Type":"ContainerStarted","Data":"5e66ce9ddb765cf5a00e7bb0c57f6403a737344a37412615c8686b4f62d03fe4"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.272639 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" podStartSLOduration=5.801484123 podStartE2EDuration="37.272593239s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" 
firstStartedPulling="2025-12-04 15:35:55.023254662 +0000 UTC m=+962.457924519" lastFinishedPulling="2025-12-04 15:36:26.494363778 +0000 UTC m=+993.929033635" observedRunningTime="2025-12-04 15:36:28.271551279 +0000 UTC m=+995.706221146" watchObservedRunningTime="2025-12-04 15:36:28.272593239 +0000 UTC m=+995.707263096" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.284075 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" event={"ID":"66135fe6-10ac-4049-b7a7-e40aa82f78e7","Type":"ContainerStarted","Data":"9d4aecf1764040e88e89eb81b01d38a9a751698fd85c84daac4b15d891898bff"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.285105 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.316486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" event={"ID":"93e0c78f-854f-4c11-b457-f5e1b429a7bc","Type":"ContainerStarted","Data":"f971a17e183e7c14a694266d7ad2fbd6733c89fd42a5001898744dea5b684446"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.317435 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.320720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" event={"ID":"1b01dbe4-9e3e-403e-938a-22f130b47202","Type":"ContainerStarted","Data":"6a674c86fcd34a3d3d27d846507a70e2395ae3e776826a75587414034b45bdff"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.365971 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" event={"ID":"d373173f-fba9-4fc1-9d7d-5424dca0303e","Type":"ContainerStarted","Data":"eda2637f85065244247fd885d2ff924d5513f5a57d49ddf20ead78ac7ab112f2"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.366038 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" event={"ID":"d373173f-fba9-4fc1-9d7d-5424dca0303e","Type":"ContainerStarted","Data":"fabe9fbee41a5721b1e4d7045d86e867bcd3f09f0e812aa08abd0d30bb7facd7"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.366536 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.426634 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" podStartSLOduration=6.279418828 podStartE2EDuration="37.426616419s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.351635676 +0000 UTC m=+962.786305533" lastFinishedPulling="2025-12-04 15:36:26.498833267 +0000 UTC m=+993.933503124" observedRunningTime="2025-12-04 15:36:28.381264945 +0000 UTC m=+995.815934802" watchObservedRunningTime="2025-12-04 15:36:28.426616419 +0000 UTC m=+995.861286276" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.435084 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" podStartSLOduration=5.292506715 podStartE2EDuration="36.435068662s" podCreationTimestamp="2025-12-04 15:35:52 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.351790481 +0000 UTC m=+962.786460338" lastFinishedPulling="2025-12-04 15:36:26.494352428 +0000 UTC m=+993.929022285" observedRunningTime="2025-12-04 15:36:28.432287202 +0000 UTC m=+995.866957059" watchObservedRunningTime="2025-12-04 15:36:28.435068662 +0000 UTC m=+995.869738519" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.466298 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" event={"ID":"7d5162d9-add8-44b3-8301-82cbd7d09878","Type":"ContainerStarted","Data":"9875d7f1eefb14b05e49b5e1e4b0b1f4fe3797141080c5843b0c9921bb422e21"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.479206 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" podStartSLOduration=12.867706713 podStartE2EDuration="37.479182941s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.003225226 +0000 UTC m=+962.437895083" lastFinishedPulling="2025-12-04 15:36:19.614701454 +0000 UTC m=+987.049371311" observedRunningTime="2025-12-04 15:36:28.472577131 +0000 UTC m=+995.907247018" watchObservedRunningTime="2025-12-04 15:36:28.479182941 +0000 UTC m=+995.913852798" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.504963 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" event={"ID":"25a6adcc-b6f7-41ee-a0d3-9594455bedda","Type":"ContainerStarted","Data":"5eb5862ff8ec30d0c555144021625654fd34f3e9af03cfac2e524b1dc3686999"} Dec 04 15:36:28 crc kubenswrapper[4676]: E1204 15:36:28.506904 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:f4b6baa2b8a661351cfc24fff5aacee5aa4198106618700cfa47ec3a75f88b31\\\"\"" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" podUID="25a6adcc-b6f7-41ee-a0d3-9594455bedda" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.522232 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" event={"ID":"c7bf3f72-274b-4db9-8822-25999acad8b6","Type":"ContainerStarted","Data":"62c04093eaba955627ae119f7ce80a66e1e7469fde13d33f3a7c0a8ff205898b"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.557536 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" event={"ID":"255159ec-7751-4663-a0b9-0e97f9c0824d","Type":"ContainerStarted","Data":"de53c3af3a48c3b2d68a824e7982c16fc3463ca61d5c900616faab334b1fef72"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.586285 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" podStartSLOduration=5.063273202 podStartE2EDuration="36.5862596s" podCreationTimestamp="2025-12-04 15:35:52 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.320350637 +0000 UTC m=+962.755020494" lastFinishedPulling="2025-12-04 15:36:26.843337025 +0000 UTC m=+994.278006892" 
observedRunningTime="2025-12-04 15:36:28.522573829 +0000 UTC m=+995.957243686" watchObservedRunningTime="2025-12-04 15:36:28.5862596 +0000 UTC m=+996.020929457" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.588464 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" event={"ID":"db83cc98-e9f7-4c8a-989a-ad3150de91b9","Type":"ContainerStarted","Data":"677a8e9ee0bcbad0ca3d542fed8d3cbdc46417ec1811a8c3d32aac1b9f6d3cb7"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.598780 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" event={"ID":"62a08aac-45ea-4944-9d7f-9d78114d07a0","Type":"ContainerStarted","Data":"b8cdf9fcebb0ee05b6ebaf14feeb2b70e845c195bda2fb9fece62edb4b37ded9"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.616597 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" event={"ID":"d28e781c-96cf-4377-8cbc-f32b112e3dc7","Type":"ContainerStarted","Data":"1957a1b895b9b63efd4aa8d94f39adbfe2c37c7a10d2b0ba8dea10acc0c10293"} Dec 04 15:36:28 crc kubenswrapper[4676]: E1204 15:36:28.618634 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:57d9cb0034a7d5c7a39410fcb619ade2010e6855344dc3a0bc2bfd98cdf345d8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" podUID="d28e781c-96cf-4377-8cbc-f32b112e3dc7" Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.624422 4676 generic.go:334] "Generic (PLEG): container finished" podID="a1089f57-2ca2-48db-b638-33223ceae381" containerID="14cad51e52c2db9f537a7503566c4cfa3856278fcc85aff5d7744134abdb49e7" exitCode=0 Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.624512 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerDied","Data":"14cad51e52c2db9f537a7503566c4cfa3856278fcc85aff5d7744134abdb49e7"} Dec 04 15:36:28 crc kubenswrapper[4676]: I1204 15:36:28.638244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" event={"ID":"dbba238e-b271-48f0-9356-c1ba4b7446f8","Type":"ContainerStarted","Data":"9f94ff1ee8b5c9f44b39d9bc65c349b551a6c3b17fa66e8ca6e0fab2d8ee8fcf"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.648523 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" event={"ID":"f5882b54-a120-4eff-88e8-bf0a5d7758ff","Type":"ContainerStarted","Data":"aea83889c5609fde88b9b301d8c6544af02df2aa210df31112d09da6618085df"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.649665 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.652113 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" event={"ID":"255159ec-7751-4663-a0b9-0e97f9c0824d","Type":"ContainerStarted","Data":"22c97bc46d711721538b7f678a959f180cdca1a169efe8d8f08a07fc5b25931c"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 
15:36:29.653024 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.656244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" event={"ID":"7d5162d9-add8-44b3-8301-82cbd7d09878","Type":"ContainerStarted","Data":"1884a9dd413ea2358576a1edb40bca0dd3b6ffa5abac02d8fad371e950bf4338"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.657074 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.660917 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" event={"ID":"dbba238e-b271-48f0-9356-c1ba4b7446f8","Type":"ContainerStarted","Data":"60458efb6f60c73283b13377834e94063ceb1776f31433a3a1e15495b20ba32f"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.661637 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.663405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" event={"ID":"3b483864-ee9a-49b1-b75f-5f9b23e9534d","Type":"ContainerStarted","Data":"5d1bf3466f72e224d8b67891a689f12b7499e8295c10a75348c4f5ea0abe5952"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.665109 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" event={"ID":"53683a17-2c47-4b4c-b145-74620d4d7a16","Type":"ContainerStarted","Data":"62731a74ecc0d9886612450d65f1d210e701ee245c14d50abdd6b1d42bdecbae"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.665596 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.667589 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" event={"ID":"8a67582d-5c84-40fc-977b-4c0d42d9864b","Type":"ContainerStarted","Data":"b70845d8d7bf2b6dec0e265b654166ec69fc9d685c23899f066ed7b0ed4247ba"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.668249 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.671471 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" event={"ID":"1b01dbe4-9e3e-403e-938a-22f130b47202","Type":"ContainerStarted","Data":"8c2624576aae3e2432c820f41a2c9eac3efab783b39c85976cda0867b538cbcb"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.672156 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.675350 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" 
event={"ID":"191599a4-dee2-4d6c-b7ba-09e4f60faaf5","Type":"ContainerStarted","Data":"ad0a29e8c0304793b85dc1545b0d045c78164e84c325e34c8addf167b917a17c"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.675389 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" event={"ID":"191599a4-dee2-4d6c-b7ba-09e4f60faaf5","Type":"ContainerStarted","Data":"d449546320d4f380d9b8d1bd414c49d196c452d6eb952bf1f9058be0eb7637b2"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.676056 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.678314 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" event={"ID":"a2059da3-6c0d-4623-8406-5f25aed58fbf","Type":"ContainerStarted","Data":"981ca1c3aaed792b307328d3575d0088f549e98242fdbe5aab80032e226f3561"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.678403 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.680441 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" event={"ID":"62a08aac-45ea-4944-9d7f-9d78114d07a0","Type":"ContainerStarted","Data":"73e04dfe06bfd8d009f3ed22359b188028d02705c215694925420a7c1fd79d13"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.681066 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.682925 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" event={"ID":"c7bf3f72-274b-4db9-8822-25999acad8b6","Type":"ContainerStarted","Data":"800bff26bc7973a92306d05a6dc6d8bcf5b98d4e39c0e4cf9c47fb670f8d78bd"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.683070 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.686036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" event={"ID":"02e4b1ff-3345-4104-b333-cba2f5cd9388","Type":"ContainerStarted","Data":"012435b96cf2861db83f83b70ead228b92a854e08e5186d89a31cc3a62a2ba28"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.686094 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.688165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" event={"ID":"171288d7-22db-4357-bbfc-0f5ffa6b709c","Type":"ContainerStarted","Data":"1de79ef8357916ad90a9ff71e22f7221f9cd4f46863a3d5e14c11b0e42a17949"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.688332 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 
15:36:29.695228 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" event={"ID":"db83cc98-e9f7-4c8a-989a-ad3150de91b9","Type":"ContainerStarted","Data":"48e9f5abf7cb933acab856cb8e552fd1aa09887e6798844c1a872d50ed37d2ad"} Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.695273 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.723265 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" podStartSLOduration=14.106613813 podStartE2EDuration="38.723227199s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.995527835 +0000 UTC m=+962.430197692" lastFinishedPulling="2025-12-04 15:36:19.612141221 +0000 UTC m=+987.046811078" observedRunningTime="2025-12-04 15:36:29.689009495 +0000 UTC m=+997.123679372" watchObservedRunningTime="2025-12-04 15:36:29.723227199 +0000 UTC m=+997.157897056" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.774089 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ngw55" podStartSLOduration=6.307145695 podStartE2EDuration="37.774067971s" podCreationTimestamp="2025-12-04 15:35:52 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.351393299 +0000 UTC m=+962.786063156" lastFinishedPulling="2025-12-04 15:36:26.818315575 +0000 UTC m=+994.252985432" observedRunningTime="2025-12-04 15:36:29.763119076 +0000 UTC m=+997.197788953" watchObservedRunningTime="2025-12-04 15:36:29.774067971 +0000 UTC m=+997.208737828" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.799773 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" podStartSLOduration=7.619347484 podStartE2EDuration="38.79975298s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.315194968 +0000 UTC m=+962.749864825" lastFinishedPulling="2025-12-04 15:36:26.495600454 +0000 UTC m=+993.930270321" observedRunningTime="2025-12-04 15:36:29.795538089 +0000 UTC m=+997.230207946" watchObservedRunningTime="2025-12-04 15:36:29.79975298 +0000 UTC m=+997.234422837" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.840461 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" podStartSLOduration=4.66679059 podStartE2EDuration="38.8404435s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.977683572 +0000 UTC m=+962.412353429" lastFinishedPulling="2025-12-04 15:36:29.151336482 +0000 UTC m=+996.586006339" observedRunningTime="2025-12-04 15:36:29.829438064 +0000 UTC m=+997.264107911" watchObservedRunningTime="2025-12-04 15:36:29.8404435 +0000 UTC m=+997.275113357" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.862286 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" podStartSLOduration=13.380444338 podStartE2EDuration="38.862266647s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.994257178 +0000 UTC m=+962.428927035" 
lastFinishedPulling="2025-12-04 15:36:20.476079487 +0000 UTC m=+987.910749344" observedRunningTime="2025-12-04 15:36:29.859475886 +0000 UTC m=+997.294145763" watchObservedRunningTime="2025-12-04 15:36:29.862266647 +0000 UTC m=+997.296936504" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.888736 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" podStartSLOduration=12.206819495 podStartE2EDuration="38.888709307s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.994550457 +0000 UTC m=+962.429220314" lastFinishedPulling="2025-12-04 15:36:21.676440269 +0000 UTC m=+989.111110126" observedRunningTime="2025-12-04 15:36:29.88533593 +0000 UTC m=+997.320005817" watchObservedRunningTime="2025-12-04 15:36:29.888709307 +0000 UTC m=+997.323379164" Dec 04 15:36:29 crc kubenswrapper[4676]: I1204 15:36:29.916941 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" podStartSLOduration=4.356559127 podStartE2EDuration="37.916898758s" podCreationTimestamp="2025-12-04 15:35:52 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.312748988 +0000 UTC m=+962.747418845" lastFinishedPulling="2025-12-04 15:36:28.873088629 +0000 UTC m=+996.307758476" observedRunningTime="2025-12-04 15:36:29.915471407 +0000 UTC m=+997.350141264" watchObservedRunningTime="2025-12-04 15:36:29.916898758 +0000 UTC m=+997.351568615" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.163384 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" podStartSLOduration=14.028834045 podStartE2EDuration="39.163363046s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.479110303 +0000 UTC m=+961.913780150" lastFinishedPulling="2025-12-04 15:36:19.613639294 +0000 UTC m=+987.048309151" observedRunningTime="2025-12-04 15:36:30.160833323 +0000 UTC m=+997.595503190" watchObservedRunningTime="2025-12-04 15:36:30.163363046 +0000 UTC m=+997.598032903" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.195448 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" podStartSLOduration=13.155759506 podStartE2EDuration="39.195424218s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:53.573764546 +0000 UTC m=+961.008434403" lastFinishedPulling="2025-12-04 15:36:19.613429258 +0000 UTC m=+987.048099115" observedRunningTime="2025-12-04 15:36:30.189864228 +0000 UTC m=+997.624534115" watchObservedRunningTime="2025-12-04 15:36:30.195424218 +0000 UTC m=+997.630094075" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.229133 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" podStartSLOduration=8.085279913 podStartE2EDuration="39.227912652s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.350991408 +0000 UTC m=+962.785661265" lastFinishedPulling="2025-12-04 15:36:26.493624147 +0000 UTC m=+993.928294004" observedRunningTime="2025-12-04 15:36:30.221835388 +0000 UTC m=+997.656505245" watchObservedRunningTime="2025-12-04 15:36:30.227912652 +0000 UTC m=+997.662582519" Dec 04 15:36:30 crc 
kubenswrapper[4676]: I1204 15:36:30.256702 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" podStartSLOduration=14.645409378 podStartE2EDuration="39.25667772s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.00299614 +0000 UTC m=+962.437665997" lastFinishedPulling="2025-12-04 15:36:19.614264482 +0000 UTC m=+987.048934339" observedRunningTime="2025-12-04 15:36:30.252058927 +0000 UTC m=+997.686728824" watchObservedRunningTime="2025-12-04 15:36:30.25667772 +0000 UTC m=+997.691347577" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.276909 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" podStartSLOduration=14.146470108 podStartE2EDuration="39.276888821s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.482590763 +0000 UTC m=+961.917260620" lastFinishedPulling="2025-12-04 15:36:19.613009476 +0000 UTC m=+987.047679333" observedRunningTime="2025-12-04 15:36:30.274868043 +0000 UTC m=+997.709537890" watchObservedRunningTime="2025-12-04 15:36:30.276888821 +0000 UTC m=+997.711558678" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.299569 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" podStartSLOduration=14.987465395 podStartE2EDuration="39.299544023s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.301983578 +0000 UTC m=+962.736653435" lastFinishedPulling="2025-12-04 15:36:19.614062206 +0000 UTC m=+987.048732063" observedRunningTime="2025-12-04 15:36:30.296627459 +0000 UTC m=+997.731297306" watchObservedRunningTime="2025-12-04 15:36:30.299544023 +0000 UTC m=+997.734213880" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.328853 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" podStartSLOduration=14.720041744 podStartE2EDuration="39.328827605s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.006434908 +0000 UTC m=+962.441104755" lastFinishedPulling="2025-12-04 15:36:19.615220759 +0000 UTC m=+987.049890616" observedRunningTime="2025-12-04 15:36:30.325151389 +0000 UTC m=+997.759821246" watchObservedRunningTime="2025-12-04 15:36:30.328827605 +0000 UTC m=+997.763497482" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.403383 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" podStartSLOduration=14.281359358 podStartE2EDuration="39.403362948s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.490300655 +0000 UTC m=+961.924970512" lastFinishedPulling="2025-12-04 15:36:19.612304245 +0000 UTC m=+987.046974102" observedRunningTime="2025-12-04 15:36:30.400178637 +0000 UTC m=+997.834848514" watchObservedRunningTime="2025-12-04 15:36:30.403362948 +0000 UTC m=+997.838032795" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.706058 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" 
event={"ID":"25a6adcc-b6f7-41ee-a0d3-9594455bedda","Type":"ContainerStarted","Data":"734545a68b928ad0825aaa68621deca1ecb2bc4d62c1166c0459f89558ef46be"} Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.707477 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.708944 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" event={"ID":"d28e781c-96cf-4377-8cbc-f32b112e3dc7","Type":"ContainerStarted","Data":"a682b7c1e14c9260bcc0ad7f3792fd7a7726ea2bbfd438026e00fe805864dfdb"} Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.709147 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.717890 4676 generic.go:334] "Generic (PLEG): container finished" podID="a1089f57-2ca2-48db-b638-33223ceae381" containerID="809c3646c9862ae49b291adde7b8e5028a3b3f51510b514b28add0d55da3ca29" exitCode=0 Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.718938 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerDied","Data":"809c3646c9862ae49b291adde7b8e5028a3b3f51510b514b28add0d55da3ca29"} Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.737095 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" podStartSLOduration=4.37469042 podStartE2EDuration="39.737068386s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:54.978427653 +0000 UTC m=+962.413097520" lastFinishedPulling="2025-12-04 15:36:30.340805629 +0000 UTC m=+997.775475486" observedRunningTime="2025-12-04 15:36:30.730927919 +0000 UTC m=+998.165597786" watchObservedRunningTime="2025-12-04 15:36:30.737068386 +0000 UTC m=+998.171738253" Dec 04 15:36:30 crc kubenswrapper[4676]: I1204 15:36:30.785609 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" podStartSLOduration=4.44323003 podStartE2EDuration="39.785584991s" podCreationTimestamp="2025-12-04 15:35:51 +0000 UTC" firstStartedPulling="2025-12-04 15:35:55.002635519 +0000 UTC m=+962.437305376" lastFinishedPulling="2025-12-04 15:36:30.34499048 +0000 UTC m=+997.779660337" observedRunningTime="2025-12-04 15:36:30.781548555 +0000 UTC m=+998.216218432" watchObservedRunningTime="2025-12-04 15:36:30.785584991 +0000 UTC m=+998.220254848" Dec 04 15:36:31 crc kubenswrapper[4676]: I1204 15:36:31.728681 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerStarted","Data":"bc3243730d453c1fcdef1b4bb7ed5a6a4b1d5f50d883527e6e1505fe25865244"} Dec 04 15:36:31 crc kubenswrapper[4676]: I1204 15:36:31.773637 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gld6t" podStartSLOduration=19.282041141 podStartE2EDuration="21.773612486s" podCreationTimestamp="2025-12-04 15:36:10 +0000 UTC" firstStartedPulling="2025-12-04 15:36:28.626641632 +0000 UTC m=+996.061311489" lastFinishedPulling="2025-12-04 
15:36:31.118212397 +0000 UTC m=+998.552882834" observedRunningTime="2025-12-04 15:36:31.755370292 +0000 UTC m=+999.190040159" watchObservedRunningTime="2025-12-04 15:36:31.773612486 +0000 UTC m=+999.208282343" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.167454 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6788cc6d75-hrr7c" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.176778 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7d5d9fd47f-vf7rm" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.506606 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-58879495c-g2b6v" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.509515 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-54485f899-zgqv7" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.557008 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79d658b66d-nxgnw" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.649877 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-d5fb87cb8-tg7br" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.674447 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5b67cfc8fb-7g426" Dec 04 15:36:32 crc kubenswrapper[4676]: I1204 15:36:32.974501 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-8f6687c44-24pgj" Dec 04 15:36:33 crc kubenswrapper[4676]: I1204 15:36:33.420795 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-695797c565-mtxgd" Dec 04 15:36:33 crc kubenswrapper[4676]: I1204 15:36:33.503971 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77868f484-f29bx" Dec 04 15:36:33 crc kubenswrapper[4676]: I1204 15:36:33.586345 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c44f899f9-n7xc5" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.201555 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.203180 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.243100 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.929951 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.957793 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/barbican-operator-controller-manager-5bfbbb859d-p52sj" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.960626 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-698d6fd7d6-h74fd" Dec 04 15:36:41 crc kubenswrapper[4676]: I1204 15:36:41.978285 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-748967c98-zbsm7" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.074942 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-85fbd69fcd-7vsrd" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.450227 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-64d7c556cd-5nstv" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.468509 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6c55d8d69b-jjrmb" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.484396 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.504534 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5cbc8c7f96-lpl84" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.557719 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-79cc9d59f5-tqc5z" Dec 04 15:36:42 crc kubenswrapper[4676]: I1204 15:36:42.914491 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-867d87977b-t4p48" Dec 04 15:36:43 crc kubenswrapper[4676]: I1204 15:36:43.561193 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-bb86466d8-x7nbg" Dec 04 15:36:43 crc kubenswrapper[4676]: I1204 15:36:43.901705 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gld6t" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server" containerID="cri-o://bc3243730d453c1fcdef1b4bb7ed5a6a4b1d5f50d883527e6e1505fe25865244" gracePeriod=2 Dec 04 15:36:48 crc kubenswrapper[4676]: I1204 15:36:48.943119 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gld6t_a1089f57-2ca2-48db-b638-33223ceae381/registry-server/0.log" Dec 04 15:36:48 crc kubenswrapper[4676]: I1204 15:36:48.944807 4676 generic.go:334] "Generic (PLEG): container finished" podID="a1089f57-2ca2-48db-b638-33223ceae381" containerID="bc3243730d453c1fcdef1b4bb7ed5a6a4b1d5f50d883527e6e1505fe25865244" exitCode=137 Dec 04 15:36:48 crc kubenswrapper[4676]: I1204 15:36:48.944944 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerDied","Data":"bc3243730d453c1fcdef1b4bb7ed5a6a4b1d5f50d883527e6e1505fe25865244"} Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.318746 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-gld6t_a1089f57-2ca2-48db-b638-33223ceae381/registry-server/0.log" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.320313 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.496074 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content\") pod \"a1089f57-2ca2-48db-b638-33223ceae381\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.496165 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities\") pod \"a1089f57-2ca2-48db-b638-33223ceae381\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.496218 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mql5z\" (UniqueName: \"kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z\") pod \"a1089f57-2ca2-48db-b638-33223ceae381\" (UID: \"a1089f57-2ca2-48db-b638-33223ceae381\") " Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.497114 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities" (OuterVolumeSpecName: "utilities") pod "a1089f57-2ca2-48db-b638-33223ceae381" (UID: "a1089f57-2ca2-48db-b638-33223ceae381"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.502169 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z" (OuterVolumeSpecName: "kube-api-access-mql5z") pod "a1089f57-2ca2-48db-b638-33223ceae381" (UID: "a1089f57-2ca2-48db-b638-33223ceae381"). InnerVolumeSpecName "kube-api-access-mql5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.519993 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1089f57-2ca2-48db-b638-33223ceae381" (UID: "a1089f57-2ca2-48db-b638-33223ceae381"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.597794 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.597833 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mql5z\" (UniqueName: \"kubernetes.io/projected/a1089f57-2ca2-48db-b638-33223ceae381-kube-api-access-mql5z\") on node \"crc\" DevicePath \"\"" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.597865 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1089f57-2ca2-48db-b638-33223ceae381-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.962081 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-gld6t_a1089f57-2ca2-48db-b638-33223ceae381/registry-server/0.log" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.963074 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gld6t" event={"ID":"a1089f57-2ca2-48db-b638-33223ceae381","Type":"ContainerDied","Data":"678d0f2693e197fdf6a0d8927bbf619a59128f2a1d106700ab16e285d2b9de27"} Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.963181 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gld6t" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.963960 4676 scope.go:117] "RemoveContainer" containerID="bc3243730d453c1fcdef1b4bb7ed5a6a4b1d5f50d883527e6e1505fe25865244" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.983850 4676 scope.go:117] "RemoveContainer" containerID="809c3646c9862ae49b291adde7b8e5028a3b3f51510b514b28add0d55da3ca29" Dec 04 15:36:50 crc kubenswrapper[4676]: I1204 15:36:50.996171 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:51 crc kubenswrapper[4676]: I1204 15:36:51.009219 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gld6t"] Dec 04 15:36:51 crc kubenswrapper[4676]: I1204 15:36:51.019616 4676 scope.go:117] "RemoveContainer" containerID="14cad51e52c2db9f537a7503566c4cfa3856278fcc85aff5d7744134abdb49e7" Dec 04 15:36:51 crc kubenswrapper[4676]: I1204 15:36:51.393207 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1089f57-2ca2-48db-b638-33223ceae381" path="/var/lib/kubelet/pods/a1089f57-2ca2-48db-b638-33223ceae381/volumes" Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.803670 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"] Dec 04 15:37:01 crc kubenswrapper[4676]: E1204 15:37:01.804762 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server" Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.804796 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server" Dec 04 15:37:01 crc kubenswrapper[4676]: E1204 15:37:01.804825 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="extract-utilities" Dec 04 15:37:01 crc kubenswrapper[4676]: 
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.803670 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"]
Dec 04 15:37:01 crc kubenswrapper[4676]: E1204 15:37:01.804762 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.804796 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server"
Dec 04 15:37:01 crc kubenswrapper[4676]: E1204 15:37:01.804825 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="extract-utilities"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.804831 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="extract-utilities"
Dec 04 15:37:01 crc kubenswrapper[4676]: E1204 15:37:01.804859 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="extract-content"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.804866 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="extract-content"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.805060 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1089f57-2ca2-48db-b638-33223ceae381" containerName="registry-server"
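Editor's note: the E-level cpu_manager.go "RemoveStaleState" lines look alarming but are routine: admitting the new dnsmasq pod triggers a sweep that purges CPU and memory pinning state still recorded for the just-deleted pod's containers. That state is checkpointed on disk; the sketch below is a hypothetical inspector, and both the default path and the JSON schema can vary by kubelet version and configuration.

```go
// cpustate.go — hypothetical inspector for the CPU manager checkpoint that the
// "Deleted CPUSet assignment" entries above are trimming (assumed default path).
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

func main() {
	raw, err := os.ReadFile("/var/lib/kubelet/cpu_manager_state")
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	var state map[string]any // decode loosely rather than assuming a schema
	if err := json.Unmarshal(raw, &state); err != nil {
		fmt.Fprintln(os.Stderr, "decode:", err)
		os.Exit(1)
	}
	// Under the static policy "entries" maps pod UID -> container -> cpuset;
	// under policy "none" (the common default) it is empty or absent.
	fmt.Println("policyName:", state["policyName"])
	fmt.Println("defaultCpuSet:", state["defaultCpuSet"])
	fmt.Println("entries:", state["entries"])
}
```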
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.806094 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.810222 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.810338 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.810338 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.810432 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-94jlh"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.815477 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"]
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.877508 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"]
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.883420 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.885832 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Dec 04 15:37:01 crc kubenswrapper[4676]: I1204 15:37:01.890640 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"]
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.046139 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tdrq\" (UniqueName: \"kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.046205 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.046440 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq79w\" (UniqueName: \"kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.046523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.046626 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.148513 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.148617 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.148672 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdrq\" (UniqueName: \"kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.148702 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.148774 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq79w\" (UniqueName: \"kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.149760 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.150193 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.150431 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.170465 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdrq\" (UniqueName: \"kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq\") pod \"dnsmasq-dns-54879cc849-jgszv\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.171498 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq79w\" (UniqueName: \"kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w\") pod \"dnsmasq-dns-6cb85897d5-zcmk6\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.211679 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.434469 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54879cc849-jgszv"
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.674264 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"]
Dec 04 15:37:02 crc kubenswrapper[4676]: I1204 15:37:02.838932 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"]
Dec 04 15:37:02 crc kubenswrapper[4676]: W1204 15:37:02.850059 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16c9a348_9198_411c_960d_182d97d8d5f3.slice/crio-e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9 WatchSource:0}: Error finding container e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9: Status 404 returned error can't find the container with id e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9
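Editor's note: the W-level message above is a benign startup race: the cgroup for crio-e7d5d0c4... appears before CRI-O has registered the container, so cadvisor's lookup returns 404. The PLEG ContainerStarted event for the same ID lands about 200 ms later (next block), which is the usual resolution. A hypothetical sanity check that pairs the two events is sketched below.

```go
// watch404.go — hypothetical check: read the journal on stdin and verify that
// every container ID behind a cadvisor "Failed to process watch event" warning
// later appears in a PLEG ContainerStarted event.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

var idRe = regexp.MustCompile(`crio-([0-9a-f]{64})`)

func main() {
	pending := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1<<20), 1<<20)
	for sc.Scan() {
		line := sc.Text()
		switch {
		case strings.Contains(line, "Failed to process watch event"):
			if m := idRe.FindStringSubmatch(line); m != nil {
				pending[m[1]] = true
			}
		case strings.Contains(line, "ContainerStarted"):
			for id := range pending {
				if strings.Contains(line, id) {
					fmt.Println("resolved by ContainerStarted:", id[:12])
					delete(pending, id)
				}
			}
		}
	}
	fmt.Println("still unresolved:", len(pending))
}
```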
Dec 04 15:37:03 crc kubenswrapper[4676]: I1204 15:37:03.070451 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6" event={"ID":"08feaa9c-6136-4163-8c17-c123473d4aef","Type":"ContainerStarted","Data":"edc9d17095515eeb1fdb1af66f3c6e4f46169fb8ed2c1d7806e0449af8821065"}
Dec 04 15:37:03 crc kubenswrapper[4676]: I1204 15:37:03.072719 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54879cc849-jgszv" event={"ID":"16c9a348-9198-411c-960d-182d97d8d5f3","Type":"ContainerStarted","Data":"e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9"}
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.003142 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.031328 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.032630 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.045440 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.335782 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.335975 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kb9\" (UniqueName: \"kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.336159 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.437747 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.437840 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67kb9\" (UniqueName: \"kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.437929 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.439150 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.440344 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.485175 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kb9\" (UniqueName: \"kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9\") pod \"dnsmasq-dns-6bb647867c-7vc6x\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.490890 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.517038 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.517384 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.519301 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.540585 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z54w\" (UniqueName: \"kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.540681 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.540712 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.543413 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.641727 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.641781 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.641856 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z54w\" (UniqueName: \"kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.643225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.643348 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.680933 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z54w\" (UniqueName: \"kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w\") pod \"dnsmasq-dns-7c858cc7bf-2k42f\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.799346 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.837000 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.838569 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.840563 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.843950 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.844038 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp248\" (UniqueName: \"kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.844133 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.844449 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"]
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.945078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.945165 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.945651 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp248\" (UniqueName: \"kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.947236 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.947236 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:06 crc kubenswrapper[4676]: I1204 15:37:06.968765 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp248\" (UniqueName: \"kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248\") pod \"dnsmasq-dns-8b7696bc7-6t68r\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.158265 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.340050 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.344207 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.347796 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.348148 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.348299 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g2s2x"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.348464 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.353240 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.361284 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.361342 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.361583 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
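Editor's note: each "Caches populated for *v1.ConfigMap/Secret" line is a kubelet reflector finishing its initial LIST for an object the new pod references, after which it keeps watching for updates; secrets and configmaps needed by the pod's volumes each get their own list/watch here. The same machinery is exposed by client-go informers. The sketch below is an out-of-cluster example that assumes a local kubeconfig; it reproduces the pattern for ConfigMaps in the openstack namespace and is not kubelet code.

```go
// informer_sketch.go — minimal client-go sketch of the list/watch pattern behind
// the reflector "Caches populated" lines (assumption: a reachable kubeconfig).
package main

import (
	"fmt"
	"time"

	v1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openstack"))
	inf := factory.Core().V1().ConfigMaps().Informer()
	inf.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Println("cache add:", obj.(*v1.ConfigMap).Name)
		},
	})

	stop := make(chan struct{})
	factory.Start(stop)
	cache.WaitForCacheSync(stop, inf.HasSynced) // the "Caches populated" moment
	select {}
}
```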
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.464046 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz2vc\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.464094 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.464136 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.464189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.464231 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566353 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " 
pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566377 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566449 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz2vc\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566522 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566550 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566606 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.566655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.567266 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.568146 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.568362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.568652 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-config-data\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.568858 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.569091 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.571262 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.573661 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.581749 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.583934 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.585136 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz2vc\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.593822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " pod="openstack/rabbitmq-server-0" Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.637935 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] 
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.637935 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.639765 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.644357 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hf49c"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.644600 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.645038 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.645502 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.646142 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.647051 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.647317 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.653985 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.674799 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771334 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771425 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771516 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771796 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771854 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771884 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771946 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.771972 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.772005 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.772038 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.772127 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqs4z\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874078 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874096 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874114 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874154 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874197 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874287 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqs4z\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874333 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874379 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.874654 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.875357 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.875394 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.875388 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.875379 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.876676 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.878784 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.879932 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.881083 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.887428 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.895406 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqs4z\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.896878 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.946340 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.949297 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.953796 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.953853 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.953805 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.954109 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.954329 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.954392 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.955740 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-2pl2p"
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.965630 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"]
Dec 04 15:37:07 crc kubenswrapper[4676]: I1204 15:37:07.972029 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077106 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a074e2a9-e6e9-488d-8338-54231ab8faf9-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077197 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077244 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077376 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvzz\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-kube-api-access-ztvzz\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077442 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077478 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077520 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077574 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077599 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a074e2a9-e6e9-488d-8338-54231ab8faf9-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.077648 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.178840 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.178886 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a074e2a9-e6e9-488d-8338-54231ab8faf9-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.178931 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.178955 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a074e2a9-e6e9-488d-8338-54231ab8faf9-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.178980 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179001 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0"
Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179040 4676 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179071 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvzz\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-kube-api-access-ztvzz\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179097 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179120 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179140 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179402 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179853 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.179867 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.180540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.180590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.180952 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a074e2a9-e6e9-488d-8338-54231ab8faf9-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.182602 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a074e2a9-e6e9-488d-8338-54231ab8faf9-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.184086 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.192427 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a074e2a9-e6e9-488d-8338-54231ab8faf9-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.194066 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.198532 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvzz\" (UniqueName: \"kubernetes.io/projected/a074e2a9-e6e9-488d-8338-54231ab8faf9-kube-api-access-ztvzz\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.202423 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a074e2a9-e6e9-488d-8338-54231ab8faf9\") " pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:08 crc kubenswrapper[4676]: I1204 15:37:08.274428 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.604151 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.605657 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.609178 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.609844 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.610010 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.610617 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.611030 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tqt7k" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.617520 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.621534 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.712545 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.712616 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.712700 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrbtw\" (UniqueName: \"kubernetes.io/projected/3588a213-92d7-43d7-8a28-6a9104f1d48e-kube-api-access-qrbtw\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.712956 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.713049 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.713131 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-secrets\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " 
pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.713153 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.713192 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.713251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814395 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-secrets\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814435 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814477 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814503 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814538 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814568 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814596 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrbtw\" (UniqueName: 
\"kubernetes.io/projected/3588a213-92d7-43d7-8a28-6a9104f1d48e-kube-api-access-qrbtw\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814644 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.814696 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.816263 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.816937 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.817417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.817473 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.817839 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3588a213-92d7-43d7-8a28-6a9104f1d48e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.829349 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.834826 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrbtw\" (UniqueName: \"kubernetes.io/projected/3588a213-92d7-43d7-8a28-6a9104f1d48e-kube-api-access-qrbtw\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 
15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.841035 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-secrets\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.841400 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3588a213-92d7-43d7-8a28-6a9104f1d48e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.843188 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3588a213-92d7-43d7-8a28-6a9104f1d48e\") " pod="openstack/openstack-galera-0" Dec 04 15:37:09 crc kubenswrapper[4676]: I1204 15:37:09.952737 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.954329 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.956317 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.959594 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.959633 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.959957 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-8qsdp" Dec 04 15:37:10 crc kubenswrapper[4676]: I1204 15:37:10.960990 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.004065 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.091666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.091712 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.091742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-secrets\") pod \"openstack-cell1-galera-0\" (UID: 
\"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.091865 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.092014 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85qp9\" (UniqueName: \"kubernetes.io/projected/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kube-api-access-85qp9\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.092058 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.092121 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.092173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.092226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.193594 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.193740 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.193884 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.194343 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.194547 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.194639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.195237 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.195646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85qp9\" (UniqueName: \"kubernetes.io/projected/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kube-api-access-85qp9\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.195687 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.195742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.195788 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.196243 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Dec 
04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.197256 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.198372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c52ad2e5-0a77-4894-8535-30b4e98cdda9-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.200221 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.207654 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.208380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c52ad2e5-0a77-4894-8535-30b4e98cdda9-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.220988 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85qp9\" (UniqueName: \"kubernetes.io/projected/c52ad2e5-0a77-4894-8535-30b4e98cdda9-kube-api-access-85qp9\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.228885 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.230243 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.231633 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4fqdh" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.238094 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.241467 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.241496 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.272717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c52ad2e5-0a77-4894-8535-30b4e98cdda9\") " pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.319221 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.398926 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.399010 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.399053 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-kolla-config\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.399340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-config-data\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.399485 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdmvm\" (UniqueName: \"kubernetes.io/projected/12baa943-6113-449f-ac06-88dd60e224fe-kube-api-access-wdmvm\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.501598 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.501682 
4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.501708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-kolla-config\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.502219 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-config-data\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.502277 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdmvm\" (UniqueName: \"kubernetes.io/projected/12baa943-6113-449f-ac06-88dd60e224fe-kube-api-access-wdmvm\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.502609 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-kolla-config\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.502798 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12baa943-6113-449f-ac06-88dd60e224fe-config-data\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.506612 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-memcached-tls-certs\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.507673 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12baa943-6113-449f-ac06-88dd60e224fe-combined-ca-bundle\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.524926 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdmvm\" (UniqueName: \"kubernetes.io/projected/12baa943-6113-449f-ac06-88dd60e224fe-kube-api-access-wdmvm\") pod \"memcached-0\" (UID: \"12baa943-6113-449f-ac06-88dd60e224fe\") " pod="openstack/memcached-0" Dec 04 15:37:11 crc kubenswrapper[4676]: I1204 15:37:11.629390 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.077001 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.078535 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.081025 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2xlpk" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.113808 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.240748 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjkf\" (UniqueName: \"kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf\") pod \"kube-state-metrics-0\" (UID: \"ea978af1-b6d8-490b-8bfd-6b2ec699f47f\") " pod="openstack/kube-state-metrics-0" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.341895 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjkf\" (UniqueName: \"kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf\") pod \"kube-state-metrics-0\" (UID: \"ea978af1-b6d8-490b-8bfd-6b2ec699f47f\") " pod="openstack/kube-state-metrics-0" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.371224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjkf\" (UniqueName: \"kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf\") pod \"kube-state-metrics-0\" (UID: \"ea978af1-b6d8-490b-8bfd-6b2ec699f47f\") " pod="openstack/kube-state-metrics-0" Dec 04 15:37:13 crc kubenswrapper[4676]: I1204 15:37:13.419484 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.285156 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.287976 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.292816 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.293087 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-dsbwb" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.293139 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.293403 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.293635 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.300099 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.303305 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.405837 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406171 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406358 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406607 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406706 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406762 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.406787 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llzrm\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.407088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.508801 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.508883 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.508938 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.508971 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.508998 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.518014 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.606537 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.510110 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.607504 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.608280 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llzrm\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.608372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.608556 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.609514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.614365 4676 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.614579 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20a8147025daa03f462937d002ea44fbf472037636c1db1460079ca29c39445e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.614757 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.630722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llzrm\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.676034 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:14 crc kubenswrapper[4676]: I1204 15:37:14.911677 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.840426 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.842151 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.846925 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9n8gk" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.846899 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.846960 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.847078 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.847399 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 04 15:37:16 crc kubenswrapper[4676]: I1204 15:37:16.855797 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.005825 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.005920 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpncj\" (UniqueName: \"kubernetes.io/projected/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-kube-api-access-wpncj\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.005980 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.006007 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.006033 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-config\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.006064 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.006084 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.006117 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.107710 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108573 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpncj\" (UniqueName: \"kubernetes.io/projected/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-kube-api-access-wpncj\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108683 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108722 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-config\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108834 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.108882 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 
15:37:17.110133 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-config\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.110441 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.110505 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.110765 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.114384 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.115710 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.117180 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.131577 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpncj\" (UniqueName: \"kubernetes.io/projected/3e3a586c-5d43-4f0a-9f77-038f2a5a0880-kube-api-access-wpncj\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.138510 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"3e3a586c-5d43-4f0a-9f77-038f2a5a0880\") " pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.166691 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.732696 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hdtnf"] Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.734117 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.738084 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-d9n82" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.738337 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.738603 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.743832 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hdtnf"] Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.789779 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8r4vm"] Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.791674 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.809128 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8r4vm"] Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929117 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4726f661-5133-4c0f-8f63-5a93481ed0df-scripts\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929185 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-ovn-controller-tls-certs\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929221 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-log\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929442 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkjcf\" (UniqueName: \"kubernetes.io/projected/ce63098e-8737-4061-94ce-2b8c76ccb26f-kube-api-access-gkjcf\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929609 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc 
kubenswrapper[4676]: I1204 15:37:17.929750 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929924 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce63098e-8737-4061-94ce-2b8c76ccb26f-scripts\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.929979 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-lib\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.930043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-run\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.930149 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-combined-ca-bundle\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.930206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjcfg\" (UniqueName: \"kubernetes.io/projected/4726f661-5133-4c0f-8f63-5a93481ed0df-kube-api-access-gjcfg\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.930297 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-log-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:17 crc kubenswrapper[4676]: I1204 15:37:17.930396 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-etc-ovs\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031448 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce63098e-8737-4061-94ce-2b8c76ccb26f-scripts\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031506 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-lib\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-run\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031563 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-combined-ca-bundle\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjcfg\" (UniqueName: \"kubernetes.io/projected/4726f661-5133-4c0f-8f63-5a93481ed0df-kube-api-access-gjcfg\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031619 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-log-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031657 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-etc-ovs\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031687 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4726f661-5133-4c0f-8f63-5a93481ed0df-scripts\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031711 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-ovn-controller-tls-certs\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031735 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-log\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031762 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkjcf\" (UniqueName: \"kubernetes.io/projected/ce63098e-8737-4061-94ce-2b8c76ccb26f-kube-api-access-gkjcf\") pod 
\"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.031809 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032084 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-run\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032157 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032238 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-lib\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032441 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-run\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032597 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ce63098e-8737-4061-94ce-2b8c76ccb26f-var-log-ovn\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032620 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-var-log\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.032598 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/4726f661-5133-4c0f-8f63-5a93481ed0df-etc-ovs\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.034112 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/4726f661-5133-4c0f-8f63-5a93481ed0df-scripts\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.036098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-ovn-controller-tls-certs\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.036163 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce63098e-8737-4061-94ce-2b8c76ccb26f-scripts\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.044528 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce63098e-8737-4061-94ce-2b8c76ccb26f-combined-ca-bundle\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.058686 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkjcf\" (UniqueName: \"kubernetes.io/projected/ce63098e-8737-4061-94ce-2b8c76ccb26f-kube-api-access-gkjcf\") pod \"ovn-controller-hdtnf\" (UID: \"ce63098e-8737-4061-94ce-2b8c76ccb26f\") " pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.058888 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjcfg\" (UniqueName: \"kubernetes.io/projected/4726f661-5133-4c0f-8f63-5a93481ed0df-kube-api-access-gjcfg\") pod \"ovn-controller-ovs-8r4vm\" (UID: \"4726f661-5133-4c0f-8f63-5a93481ed0df\") " pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.111458 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:18 crc kubenswrapper[4676]: I1204 15:37:18.358933 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.974456 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.977054 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.980491 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-g7x7v" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.982080 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.982107 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.982734 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 04 15:37:20 crc kubenswrapper[4676]: I1204 15:37:20.987863 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.156828 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gl5\" (UniqueName: \"kubernetes.io/projected/163c3f92-f9e6-43bb-8958-c3715f2dae4a-kube-api-access-s4gl5\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.156889 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.156954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.156986 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.157028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.157085 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.157116 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " 
pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.157143 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.258895 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gl5\" (UniqueName: \"kubernetes.io/projected/163c3f92-f9e6-43bb-8958-c3715f2dae4a-kube-api-access-s4gl5\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259042 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259112 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259173 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259241 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259269 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.259693 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.260542 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.261198 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.261296 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/163c3f92-f9e6-43bb-8958-c3715f2dae4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.266297 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.266515 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.277233 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/163c3f92-f9e6-43bb-8958-c3715f2dae4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.279691 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gl5\" (UniqueName: \"kubernetes.io/projected/163c3f92-f9e6-43bb-8958-c3715f2dae4a-kube-api-access-s4gl5\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.294890 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"163c3f92-f9e6-43bb-8958-c3715f2dae4a\") " pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:21 crc kubenswrapper[4676]: I1204 15:37:21.299387 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.056135 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.056240 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.056412 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nq79w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6cb85897d5-zcmk6_openstack(08feaa9c-6136-4163-8c17-c123473d4aef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.057559 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6" podUID="08feaa9c-6136-4163-8c17-c123473d4aef" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.090819 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.090882 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.091035 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tdrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54879cc849-jgszv_openstack(16c9a348-9198-411c-960d-182d97d8d5f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:37:22 crc kubenswrapper[4676]: E1204 15:37:22.093319 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54879cc849-jgszv" podUID="16c9a348-9198-411c-960d-182d97d8d5f3" Dec 04 15:37:22 crc kubenswrapper[4676]: I1204 15:37:22.974440 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.424463 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54879cc849-jgszv" event={"ID":"16c9a348-9198-411c-960d-182d97d8d5f3","Type":"ContainerDied","Data":"e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9"} Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 
15:37:23.424523 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7d5d0c439c32f6f054a1617aaa42d882dc3fb8c9d8311a37ee54dbab58baed9" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.425430 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a074e2a9-e6e9-488d-8338-54231ab8faf9","Type":"ContainerStarted","Data":"1edcf76fa8106c62790402a8ed82ef16ab65c192d38aba4bd924b6b72aff4e4c"} Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.426875 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6" event={"ID":"08feaa9c-6136-4163-8c17-c123473d4aef","Type":"ContainerDied","Data":"edc9d17095515eeb1fdb1af66f3c6e4f46169fb8ed2c1d7806e0449af8821065"} Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.426942 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc9d17095515eeb1fdb1af66f3c6e4f46169fb8ed2c1d7806e0449af8821065" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.451476 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54879cc849-jgszv" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.457830 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.627634 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config\") pod \"08feaa9c-6136-4163-8c17-c123473d4aef\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.627729 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq79w\" (UniqueName: \"kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w\") pod \"08feaa9c-6136-4163-8c17-c123473d4aef\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.627764 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config\") pod \"16c9a348-9198-411c-960d-182d97d8d5f3\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.627806 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tdrq\" (UniqueName: \"kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq\") pod \"16c9a348-9198-411c-960d-182d97d8d5f3\" (UID: \"16c9a348-9198-411c-960d-182d97d8d5f3\") " Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.627852 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc\") pod \"08feaa9c-6136-4163-8c17-c123473d4aef\" (UID: \"08feaa9c-6136-4163-8c17-c123473d4aef\") " Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.628341 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config" (OuterVolumeSpecName: "config") pod "16c9a348-9198-411c-960d-182d97d8d5f3" (UID: "16c9a348-9198-411c-960d-182d97d8d5f3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.628391 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config" (OuterVolumeSpecName: "config") pod "08feaa9c-6136-4163-8c17-c123473d4aef" (UID: "08feaa9c-6136-4163-8c17-c123473d4aef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.628681 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.628701 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16c9a348-9198-411c-960d-182d97d8d5f3-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.628962 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08feaa9c-6136-4163-8c17-c123473d4aef" (UID: "08feaa9c-6136-4163-8c17-c123473d4aef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.638917 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w" (OuterVolumeSpecName: "kube-api-access-nq79w") pod "08feaa9c-6136-4163-8c17-c123473d4aef" (UID: "08feaa9c-6136-4163-8c17-c123473d4aef"). InnerVolumeSpecName "kube-api-access-nq79w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.643063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq" (OuterVolumeSpecName: "kube-api-access-2tdrq") pod "16c9a348-9198-411c-960d-182d97d8d5f3" (UID: "16c9a348-9198-411c-960d-182d97d8d5f3"). InnerVolumeSpecName "kube-api-access-2tdrq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.731095 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq79w\" (UniqueName: \"kubernetes.io/projected/08feaa9c-6136-4163-8c17-c123473d4aef-kube-api-access-nq79w\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.731144 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tdrq\" (UniqueName: \"kubernetes.io/projected/16c9a348-9198-411c-960d-182d97d8d5f3-kube-api-access-2tdrq\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.731158 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08feaa9c-6136-4163-8c17-c123473d4aef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.765144 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:37:23 crc kubenswrapper[4676]: W1204 15:37:23.787169 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b4fb4d_9b61_414f_a78c_71a143c965d2.slice/crio-d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865 WatchSource:0}: Error finding container d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865: Status 404 returned error can't find the container with id d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865 Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.791097 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.819927 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"] Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.830948 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hdtnf"] Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.851527 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 04 15:37:23 crc kubenswrapper[4676]: W1204 15:37:23.859184 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc52ad2e5_0a77_4894_8535_30b4e98cdda9.slice/crio-395164516eafee757645a469eef2799502f07bbb5b737ce4da1be3e270b01524 WatchSource:0}: Error finding container 395164516eafee757645a469eef2799502f07bbb5b737ce4da1be3e270b01524: Status 404 returned error can't find the container with id 395164516eafee757645a469eef2799502f07bbb5b737ce4da1be3e270b01524 Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.864544 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"] Dec 04 15:37:23 crc kubenswrapper[4676]: I1204 15:37:23.875563 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 04 15:37:23 crc kubenswrapper[4676]: W1204 15:37:23.897627 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3a586c_5d43_4f0a_9f77_038f2a5a0880.slice/crio-959160804a24d5e9589e399205e03df9bbbfe9c64e8a9b3affe9225be46f68b4 WatchSource:0}: Error finding container 959160804a24d5e9589e399205e03df9bbbfe9c64e8a9b3affe9225be46f68b4: Status 404 returned error can't find the container with id 
959160804a24d5e9589e399205e03df9bbbfe9c64e8a9b3affe9225be46f68b4 Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.104994 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.113675 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 15:37:24 crc kubenswrapper[4676]: W1204 15:37:24.130368 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3588a213_92d7_43d7_8a28_6a9104f1d48e.slice/crio-dd883650c8bb3f5f94b828b7430e1fe2e8d68d4170a2de1a32ffbd6924d574ac WatchSource:0}: Error finding container dd883650c8bb3f5f94b828b7430e1fe2e8d68d4170a2de1a32ffbd6924d574ac: Status 404 returned error can't find the container with id dd883650c8bb3f5f94b828b7430e1fe2e8d68d4170a2de1a32ffbd6924d574ac Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.136403 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"] Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.149587 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:37:24 crc kubenswrapper[4676]: W1204 15:37:24.154634 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09d23694_3775_496d_ba9a_888abb40ea10.slice/crio-70c2ab7eb7166f583eb737038ce43ad8486145f6729b3314daf9324edf14594a WatchSource:0}: Error finding container 70c2ab7eb7166f583eb737038ce43ad8486145f6729b3314daf9324edf14594a: Status 404 returned error can't find the container with id 70c2ab7eb7166f583eb737038ce43ad8486145f6729b3314daf9324edf14594a Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.161831 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 04 15:37:24 crc kubenswrapper[4676]: W1204 15:37:24.168804 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12baa943_6113_449f_ac06_88dd60e224fe.slice/crio-fe217e0be960f71310d9d1413d20301b94b3469c70da618d68a23c541dc5fe27 WatchSource:0}: Error finding container fe217e0be960f71310d9d1413d20301b94b3469c70da618d68a23c541dc5fe27: Status 404 returned error can't find the container with id fe217e0be960f71310d9d1413d20301b94b3469c70da618d68a23c541dc5fe27 Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.193181 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 04 15:37:24 crc kubenswrapper[4676]: W1204 15:37:24.199468 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod163c3f92_f9e6_43bb_8958_c3715f2dae4a.slice/crio-50717c99db70b64055a64bc86fead7d815fd2845f49121568334a50d0ffa913e WatchSource:0}: Error finding container 50717c99db70b64055a64bc86fead7d815fd2845f49121568334a50d0ffa913e: Status 404 returned error can't find the container with id 50717c99db70b64055a64bc86fead7d815fd2845f49121568334a50d0ffa913e Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.448599 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3e3a586c-5d43-4f0a-9f77-038f2a5a0880","Type":"ContainerStarted","Data":"959160804a24d5e9589e399205e03df9bbbfe9c64e8a9b3affe9225be46f68b4"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.450369 4676 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"163c3f92-f9e6-43bb-8958-c3715f2dae4a","Type":"ContainerStarted","Data":"50717c99db70b64055a64bc86fead7d815fd2845f49121568334a50d0ffa913e"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.453454 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerStarted","Data":"487f160e9670c758fa5fa69d1fdc5eb7d441fde0f1d868194152c72d5169f7cf"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.455353 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ea978af1-b6d8-490b-8bfd-6b2ec699f47f","Type":"ContainerStarted","Data":"2042bee0f9f93f4842370f0136ee9d3ff847165421fdf33cfc079a1e57d94e44"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.457630 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c52ad2e5-0a77-4894-8535-30b4e98cdda9","Type":"ContainerStarted","Data":"395164516eafee757645a469eef2799502f07bbb5b737ce4da1be3e270b01524"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.459690 4676 generic.go:334] "Generic (PLEG): container finished" podID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerID="e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd" exitCode=0 Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.459751 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" event={"ID":"e6b4fb4d-9b61-414f-a78c-71a143c965d2","Type":"ContainerDied","Data":"e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.459797 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" event={"ID":"e6b4fb4d-9b61-414f-a78c-71a143c965d2","Type":"ContainerStarted","Data":"d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.463413 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerStarted","Data":"eba9024ff6b212171ba475bacce568c31c34c5f7f0101258262a4e0fc6b4fb76"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.466000 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"12baa943-6113-449f-ac06-88dd60e224fe","Type":"ContainerStarted","Data":"fe217e0be960f71310d9d1413d20301b94b3469c70da618d68a23c541dc5fe27"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.470184 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3588a213-92d7-43d7-8a28-6a9104f1d48e","Type":"ContainerStarted","Data":"dd883650c8bb3f5f94b828b7430e1fe2e8d68d4170a2de1a32ffbd6924d574ac"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.473778 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerStarted","Data":"1658181f55868b57375875aef87b050de768632265a14fd0aee01662549f7375"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.475863 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x" 
event={"ID":"09d23694-3775-496d-ba9a-888abb40ea10","Type":"ContainerStarted","Data":"70c2ab7eb7166f583eb737038ce43ad8486145f6729b3314daf9324edf14594a"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.492566 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hdtnf" event={"ID":"ce63098e-8737-4061-94ce-2b8c76ccb26f","Type":"ContainerStarted","Data":"5136a3321b09ddc20c906b8ede3735391c05e8d8ddf5805be22a62583ed57ad4"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.499386 4676 generic.go:334] "Generic (PLEG): container finished" podID="485b242f-88d0-4521-a25c-e9a957a58e19" containerID="655f4974c5d474c3ce92d089cfdd7cc1363536c57807541d0a3429c5aa031a56" exitCode=0 Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.499484 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54879cc849-jgszv" Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.509996 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" event={"ID":"485b242f-88d0-4521-a25c-e9a957a58e19","Type":"ContainerDied","Data":"655f4974c5d474c3ce92d089cfdd7cc1363536c57807541d0a3429c5aa031a56"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.510046 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" event={"ID":"485b242f-88d0-4521-a25c-e9a957a58e19","Type":"ContainerStarted","Data":"188fbfc406ca281b17eefc85ca916ee21694f76f209653d75608a0f751c2a49c"} Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.516615 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb85897d5-zcmk6" Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.780166 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"] Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.799698 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb85897d5-zcmk6"] Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.821565 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"] Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.829215 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54879cc849-jgszv"] Dec 04 15:37:24 crc kubenswrapper[4676]: E1204 15:37:24.833130 4676 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 04 15:37:24 crc kubenswrapper[4676]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e6b4fb4d-9b61-414f-a78c-71a143c965d2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 15:37:24 crc kubenswrapper[4676]: > podSandboxID="d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865" Dec 04 15:37:24 crc kubenswrapper[4676]: E1204 15:37:24.833396 4676 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 04 15:37:24 crc kubenswrapper[4676]: container &Container{Name:dnsmasq-dns,Image:38.129.56.200:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4z54w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c858cc7bf-2k42f_openstack(e6b4fb4d-9b61-414f-a78c-71a143c965d2): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e6b4fb4d-9b61-414f-a78c-71a143c965d2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 04 15:37:24 crc kubenswrapper[4676]: > logger="UnhandledError" Dec 04 15:37:24 crc kubenswrapper[4676]: E1204 15:37:24.834679 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e6b4fb4d-9b61-414f-a78c-71a143c965d2/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2"
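
Annotation: the CreateContainerError above is a subPath staging race. The &Container{...} dump in the UnhandledError entry declares the mount pair shown below; kubelet stages each subPath mount under /var/lib/kubelet/pods/<podUID>/volume-subpaths/<volume>/<container>/<mountIndex> (here index 1, the dns-svc mount), and CRI-O bind-mounts that staged path into the container. The staged source was missing at create time, and the same container came up cleanly a few seconds later (ContainerStarted fe1a83bd... at 15:37:29), which is the usual signature of this race. A minimal re-typing of the relevant spec fields in Go using the k8s.io/api types; the field values are copied from the dump above, and only the packaging as a standalone program is illustrative.

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
        "k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
        c := corev1.Container{
            Name:    "dnsmasq-dns",
            Command: []string{"/bin/bash"},
            VolumeMounts: []corev1.VolumeMount{
                // index 0: rendered config, mounted as a single file via subPath
                {Name: "config", ReadOnly: true, MountPath: "/etc/dnsmasq.d/config.cfg", SubPath: "dns"},
                // index 1: the mount that failed; its staged source is
                // /var/lib/kubelet/pods/<podUID>/volume-subpaths/dns-svc/dnsmasq-dns/1
                {Name: "dns-svc", ReadOnly: true, MountPath: "/etc/dnsmasq.d/hosts/dns-svc", SubPath: "dns-svc"},
            },
            ReadinessProbe: &corev1.Probe{
                ProbeHandler: corev1.ProbeHandler{
                    TCPSocket: &corev1.TCPSocketAction{Port: intstr.FromInt(5353)},
                },
                InitialDelaySeconds: 5, TimeoutSeconds: 5, PeriodSeconds: 5,
                SuccessThreshold: 1, FailureThreshold: 3,
            },
        }
        fmt.Printf("%s: %d mounts, readiness probe on TCP %s\n",
            c.Name, len(c.VolumeMounts), c.ReadinessProbe.TCPSocket.Port.String())
    }

Dec 04 15:37:24 crc kubenswrapper[4676]: I1204 15:37:24.861669 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8r4vm"] Dec 04 15:37:25 crc kubenswrapper[4676]: I1204 15:37:25.402357 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08feaa9c-6136-4163-8c17-c123473d4aef" path="/var/lib/kubelet/pods/08feaa9c-6136-4163-8c17-c123473d4aef/volumes" Dec 04 15:37:25 crc kubenswrapper[4676]: I1204 15:37:25.403279 4676 kubelet_volumes.go:163] "Cleaned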
up orphaned pod volumes dir" podUID="16c9a348-9198-411c-960d-182d97d8d5f3" path="/var/lib/kubelet/pods/16c9a348-9198-411c-960d-182d97d8d5f3/volumes" Dec 04 15:37:25 crc kubenswrapper[4676]: I1204 15:37:25.514822 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8r4vm" event={"ID":"4726f661-5133-4c0f-8f63-5a93481ed0df","Type":"ContainerStarted","Data":"f4d173db120ad2c97586848481d2c1dfe23c9dfda80cf2bebe983a4f40dab416"} Dec 04 15:37:29 crc kubenswrapper[4676]: E1204 15:37:29.002471 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09d23694_3775_496d_ba9a_888abb40ea10.slice/crio-conmon-4758e6cb4b36740bd71a11c07983f3778ee84f9f3452e1759eac1e972ac1c9f1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09d23694_3775_496d_ba9a_888abb40ea10.slice/crio-4758e6cb4b36740bd71a11c07983f3778ee84f9f3452e1759eac1e972ac1c9f1.scope\": RecentStats: unable to find data in memory cache]" Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.550136 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" event={"ID":"485b242f-88d0-4521-a25c-e9a957a58e19","Type":"ContainerStarted","Data":"4be8f5776f5b5a8419db302f71216b959f822b3d4354c9ee10f9042623971077"} Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.550479 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.552173 4676 generic.go:334] "Generic (PLEG): container finished" podID="09d23694-3775-496d-ba9a-888abb40ea10" containerID="4758e6cb4b36740bd71a11c07983f3778ee84f9f3452e1759eac1e972ac1c9f1" exitCode=0 Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.552269 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x" event={"ID":"09d23694-3775-496d-ba9a-888abb40ea10","Type":"ContainerDied","Data":"4758e6cb4b36740bd71a11c07983f3778ee84f9f3452e1759eac1e972ac1c9f1"} Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.555020 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" event={"ID":"e6b4fb4d-9b61-414f-a78c-71a143c965d2","Type":"ContainerStarted","Data":"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a"} Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.555203 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.578132 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" podStartSLOduration=23.463252451 podStartE2EDuration="23.578095218s" podCreationTimestamp="2025-12-04 15:37:06 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.87885432 +0000 UTC m=+1051.313524177" lastFinishedPulling="2025-12-04 15:37:23.993697087 +0000 UTC m=+1051.428366944" observedRunningTime="2025-12-04 15:37:29.569658533 +0000 UTC m=+1057.004328400" watchObservedRunningTime="2025-12-04 15:37:29.578095218 +0000 UTC m=+1057.012765075" Dec 04 15:37:29 crc kubenswrapper[4676]: I1204 15:37:29.593551 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" podStartSLOduration=23.444042514 podStartE2EDuration="23.593531677s" 
podCreationTimestamp="2025-12-04 15:37:06 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.7983166 +0000 UTC m=+1051.232986457" lastFinishedPulling="2025-12-04 15:37:23.947805763 +0000 UTC m=+1051.382475620" observedRunningTime="2025-12-04 15:37:29.587648196 +0000 UTC m=+1057.022318073" watchObservedRunningTime="2025-12-04 15:37:29.593531677 +0000 UTC m=+1057.028201534"
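
Annotation: the two pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from that. A small Go check using the dnsmasq-dns-8b7696bc7-6t68r timestamps copied from the log:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2025-12-04 15:37:06 +0000 UTC")            // podCreationTimestamp
        firstPull := parse("2025-12-04 15:37:23.87885432 +0000 UTC") // firstStartedPulling
        lastPull := parse("2025-12-04 15:37:23.993697087 +0000 UTC") // lastFinishedPulling
        running := parse("2025-12-04 15:37:29.578095218 +0000 UTC")  // watchObservedRunningTime

        e2e := running.Sub(created)          // 23.578095218s = podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // 23.463252451s = podStartSLOduration
        fmt.Println(e2e, slo)
    }

Dec 04 15:37:30 crc kubenswrapper[4676]: I1204 15:37:30.821892 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.002786 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67kb9\" (UniqueName: \"kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9\") pod \"09d23694-3775-496d-ba9a-888abb40ea10\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.002830 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config\") pod \"09d23694-3775-496d-ba9a-888abb40ea10\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.002917 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc\") pod \"09d23694-3775-496d-ba9a-888abb40ea10\" (UID: \"09d23694-3775-496d-ba9a-888abb40ea10\") " Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.021763 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9" (OuterVolumeSpecName: "kube-api-access-67kb9") pod "09d23694-3775-496d-ba9a-888abb40ea10" (UID: "09d23694-3775-496d-ba9a-888abb40ea10"). InnerVolumeSpecName "kube-api-access-67kb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.023994 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config" (OuterVolumeSpecName: "config") pod "09d23694-3775-496d-ba9a-888abb40ea10" (UID: "09d23694-3775-496d-ba9a-888abb40ea10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.029640 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09d23694-3775-496d-ba9a-888abb40ea10" (UID: "09d23694-3775-496d-ba9a-888abb40ea10"). InnerVolumeSpecName "dns-svc".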
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.104994 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.105041 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67kb9\" (UniqueName: \"kubernetes.io/projected/09d23694-3775-496d-ba9a-888abb40ea10-kube-api-access-67kb9\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.105055 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09d23694-3775-496d-ba9a-888abb40ea10-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.573263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x" event={"ID":"09d23694-3775-496d-ba9a-888abb40ea10","Type":"ContainerDied","Data":"70c2ab7eb7166f583eb737038ce43ad8486145f6729b3314daf9324edf14594a"} Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.573349 4676 scope.go:117] "RemoveContainer" containerID="4758e6cb4b36740bd71a11c07983f3778ee84f9f3452e1759eac1e972ac1c9f1" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.573502 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb647867c-7vc6x" Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.620966 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"] Dec 04 15:37:31 crc kubenswrapper[4676]: I1204 15:37:31.627858 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb647867c-7vc6x"] Dec 04 15:37:33 crc kubenswrapper[4676]: I1204 15:37:33.398480 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d23694-3775-496d-ba9a-888abb40ea10" path="/var/lib/kubelet/pods/09d23694-3775-496d-ba9a-888abb40ea10/volumes" Dec 04 15:37:35 crc kubenswrapper[4676]: I1204 15:37:35.624640 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c52ad2e5-0a77-4894-8535-30b4e98cdda9","Type":"ContainerStarted","Data":"604fb237cae8c61a13084c33e2c787e4aa42a3a6906a4174ea404c017cf954e5"} Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.635303 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3e3a586c-5d43-4f0a-9f77-038f2a5a0880","Type":"ContainerStarted","Data":"1f281219202a285315a9114f945770561ea6e87d9ffd1d2f95d67a854b21e22d"} Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.637510 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"12baa943-6113-449f-ac06-88dd60e224fe","Type":"ContainerStarted","Data":"3ae6a27c6fff775bf8c8e1cb1de87a06665faaa20e39aa9743250742295b7483"} Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.637623 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.639096 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3588a213-92d7-43d7-8a28-6a9104f1d48e","Type":"ContainerStarted","Data":"0f58bd77c81dabe6a9838b0f54fa6f143a7b76405935823f03d620ee6ea2aed2"} Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.657700 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.723347355 podStartE2EDuration="25.657676772s" podCreationTimestamp="2025-12-04 15:37:11 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.18002452 +0000 UTC m=+1051.614694377" lastFinishedPulling="2025-12-04 15:37:34.114353937 +0000 UTC m=+1061.549023794" observedRunningTime="2025-12-04 15:37:36.656334203 +0000 UTC m=+1064.091004060" watchObservedRunningTime="2025-12-04 15:37:36.657676772 +0000 UTC m=+1064.092346639" Dec 04 15:37:36 crc kubenswrapper[4676]: I1204 15:37:36.843151 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.162377 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.242775 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"] Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.651850 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hdtnf" event={"ID":"ce63098e-8737-4061-94ce-2b8c76ccb26f","Type":"ContainerStarted","Data":"0c10b61e6b0cffe9218b7d8d5552a260e0c055e15cd70be6eb03135f6f78af8a"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.651971 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-hdtnf" Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.655208 4676 generic.go:334] "Generic (PLEG): container finished" podID="4726f661-5133-4c0f-8f63-5a93481ed0df" containerID="75fe65b941e5594c0ee1018233c2fb7289847078c3d8210892fa0e645b8e765c" exitCode=0 Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.655288 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8r4vm" event={"ID":"4726f661-5133-4c0f-8f63-5a93481ed0df","Type":"ContainerDied","Data":"75fe65b941e5594c0ee1018233c2fb7289847078c3d8210892fa0e645b8e765c"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.658207 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a074e2a9-e6e9-488d-8338-54231ab8faf9","Type":"ContainerStarted","Data":"aab1f4365096fc9f95b98fb41f7d714cf599ff2efa4ba3bf021e19be29151223"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.663253 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerStarted","Data":"1e79cadee4110746d5dcc8072fd80203a89b940c26619c6972fe68e00666b3ab"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.666271 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"163c3f92-f9e6-43bb-8958-c3715f2dae4a","Type":"ContainerStarted","Data":"1a5ad6b069a886733834e3a39c2b56ca16713c96ac834c2f1a48a070f1127f64"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.669847 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerStarted","Data":"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.678840 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-hdtnf" podStartSLOduration=10.379341595 
podStartE2EDuration="20.678817561s" podCreationTimestamp="2025-12-04 15:37:17 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.814398057 +0000 UTC m=+1051.249067914" lastFinishedPulling="2025-12-04 15:37:34.113874023 +0000 UTC m=+1061.548543880" observedRunningTime="2025-12-04 15:37:37.677559124 +0000 UTC m=+1065.112229001" watchObservedRunningTime="2025-12-04 15:37:37.678817561 +0000 UTC m=+1065.113487418" Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.678869 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ea978af1-b6d8-490b-8bfd-6b2ec699f47f","Type":"ContainerStarted","Data":"1d295816b0c149b0ed9563d54bf078c7551d06e7304c2510bbb73404b846decd"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.679712 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.685036 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerStarted","Data":"7dcb20f95ba2c36d461fad9df709170e9632819ba97a81bff20c81a9af750c0b"} Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.685412 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="dnsmasq-dns" containerID="cri-o://fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a" gracePeriod=10 Dec 04 15:37:37 crc kubenswrapper[4676]: I1204 15:37:37.816878 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.561273607 podStartE2EDuration="24.816856192s" podCreationTimestamp="2025-12-04 15:37:13 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.778467423 +0000 UTC m=+1051.213137280" lastFinishedPulling="2025-12-04 15:37:35.034050008 +0000 UTC m=+1062.468719865" observedRunningTime="2025-12-04 15:37:37.81610805 +0000 UTC m=+1065.250777917" watchObservedRunningTime="2025-12-04 15:37:37.816856192 +0000 UTC m=+1065.251526049" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.648159 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.695092 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8r4vm" event={"ID":"4726f661-5133-4c0f-8f63-5a93481ed0df","Type":"ContainerStarted","Data":"7ea32de5c64c07056c2e5d59085daf38a549c5c65825417a517329545ac61d7a"} Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.696939 4676 generic.go:334] "Generic (PLEG): container finished" podID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerID="fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a" exitCode=0 Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.697021 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.697062 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" event={"ID":"e6b4fb4d-9b61-414f-a78c-71a143c965d2","Type":"ContainerDied","Data":"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a"} Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.697157 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c858cc7bf-2k42f" event={"ID":"e6b4fb4d-9b61-414f-a78c-71a143c965d2","Type":"ContainerDied","Data":"d3d25acdb293ed95e19573a8558a973c4cd95aae3e7c2c4d3a906608934c6865"} Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.697188 4676 scope.go:117] "RemoveContainer" containerID="fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.721275 4676 scope.go:117] "RemoveContainer" containerID="e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.737512 4676 scope.go:117] "RemoveContainer" containerID="fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a" Dec 04 15:37:38 crc kubenswrapper[4676]: E1204 15:37:38.738056 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a\": container with ID starting with fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a not found: ID does not exist" containerID="fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.738105 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a"} err="failed to get container status \"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a\": rpc error: code = NotFound desc = could not find container \"fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a\": container with ID starting with fe1a83bdbb31d4cee90d1aea78ba225b7c9feebc50a3e0341dd61e2f6693947a not found: ID does not exist" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.738134 4676 scope.go:117] "RemoveContainer" containerID="e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd" Dec 04 15:37:38 crc kubenswrapper[4676]: E1204 15:37:38.738497 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd\": container with ID starting with e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd not found: ID does not exist" containerID="e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.738533 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd"} err="failed to get container status \"e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd\": rpc error: code = NotFound desc = could not find container \"e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd\": container with ID starting with e4d7c8e8f774a5f160a6671930f6f778070f9f09f6ab3c9b05837b4d9c5ce7dd not found: ID does not exist"
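
Annotation: the paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" entries above are benign. "RemoveContainer" ran twice for each container ID, and the second pass asked CRI-O about a container that was already gone, getting gRPC NotFound back. Cleanup code stays idempotent by mapping NotFound to success; a sketch, where the remove callback standing in for the CRI client call is an assumption:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeIfPresent deletes a container but treats "already gone" as success,
    // so a second cleanup pass over the same ID is a no-op instead of an error.
    func removeIfPresent(remove func(id string) error, id string) error {
        err := remove(id)
        if status.Code(err) == codes.NotFound {
            return nil // the state the caller wanted already holds
        }
        return err // nil on success, or a real failure
    }

    func main() {
        gone := func(id string) error {
            return status.Errorf(codes.NotFound, "could not find container %q", id)
        }
        fmt.Println(removeIfPresent(gone, "fe1a83bd")) // <nil>
    }

Dec 04 15:37:38 crc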
kubenswrapper[4676]: I1204 15:37:38.754314 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z54w\" (UniqueName: \"kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w\") pod \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.754513 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config\") pod \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.754567 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc\") pod \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\" (UID: \"e6b4fb4d-9b61-414f-a78c-71a143c965d2\") " Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.761533 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w" (OuterVolumeSpecName: "kube-api-access-4z54w") pod "e6b4fb4d-9b61-414f-a78c-71a143c965d2" (UID: "e6b4fb4d-9b61-414f-a78c-71a143c965d2"). InnerVolumeSpecName "kube-api-access-4z54w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.798426 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config" (OuterVolumeSpecName: "config") pod "e6b4fb4d-9b61-414f-a78c-71a143c965d2" (UID: "e6b4fb4d-9b61-414f-a78c-71a143c965d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.800538 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6b4fb4d-9b61-414f-a78c-71a143c965d2" (UID: "e6b4fb4d-9b61-414f-a78c-71a143c965d2"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.857594 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z54w\" (UniqueName: \"kubernetes.io/projected/e6b4fb4d-9b61-414f-a78c-71a143c965d2-kube-api-access-4z54w\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.857631 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:38 crc kubenswrapper[4676]: I1204 15:37:38.857645 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6b4fb4d-9b61-414f-a78c-71a143c965d2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:39 crc kubenswrapper[4676]: I1204 15:37:39.043060 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"] Dec 04 15:37:39 crc kubenswrapper[4676]: I1204 15:37:39.052443 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c858cc7bf-2k42f"] Dec 04 15:37:39 crc kubenswrapper[4676]: I1204 15:37:39.395659 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" path="/var/lib/kubelet/pods/e6b4fb4d-9b61-414f-a78c-71a143c965d2/volumes"
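
Annotation: "Cleaned up orphaned pod volumes dir" is the final step of the teardown sequence visible above: volumes detached, API object deleted (SyncLoop DELETE then REMOVE), and finally /var/lib/kubelet/pods/<uid>/volumes removed once nothing is mounted under it. A simplified sketch of the shape of that scan, not kubelet's actual implementation:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // orphanedVolumeDirs lists pod volume directories whose UID no longer
    // belongs to any active pod; kubelet only removes them after confirming
    // every volume beneath them has been unmounted.
    func orphanedVolumeDirs(podsRoot string, active map[string]bool) ([]string, error) {
        entries, err := os.ReadDir(podsRoot)
        if err != nil {
            return nil, err
        }
        var orphans []string
        for _, e := range entries {
            if e.IsDir() && !active[e.Name()] {
                orphans = append(orphans, filepath.Join(podsRoot, e.Name(), "volumes"))
            }
        }
        return orphans, nil
    }

    func main() {
        active := map[string]bool{"c83d9914-203c-4a22-a92f-80851859fd48": true}
        dirs, err := orphanedVolumeDirs("/var/lib/kubelet/pods", active)
        fmt.Println(dirs, err)
    }

Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.716790 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"163c3f92-f9e6-43bb-8958-c3715f2dae4a","Type":"ContainerStarted","Data":"32ecde97ae0114ce7fb0eff35a70da19964a22acf8af94bd9459156b40686c6e"} Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.719231 4676 generic.go:334] "Generic (PLEG): container finished" podID="c52ad2e5-0a77-4894-8535-30b4e98cdda9" containerID="604fb237cae8c61a13084c33e2c787e4aa42a3a6906a4174ea404c017cf954e5" exitCode=0 Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.719296 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c52ad2e5-0a77-4894-8535-30b4e98cdda9","Type":"ContainerDied","Data":"604fb237cae8c61a13084c33e2c787e4aa42a3a6906a4174ea404c017cf954e5"} Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.723883 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8r4vm" event={"ID":"4726f661-5133-4c0f-8f63-5a93481ed0df","Type":"ContainerStarted","Data":"06da3f505f4795554a9746b8428c6e4b3c55e5edf0ff7ec6ee26154ae1e58a8e"} Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.724036 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.725686 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3e3a586c-5d43-4f0a-9f77-038f2a5a0880","Type":"ContainerStarted","Data":"e100d34145babc46bd5455efcd599cc1c5c56015a95e0945299d31bc074a7d0e"} Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.754814 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.601078994 podStartE2EDuration="21.753858914s" podCreationTimestamp="2025-12-04 15:37:19 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.202612017 +0000 UTC m=+1051.637281874" lastFinishedPulling="2025-12-04 15:37:40.355391937 +0000 UTC m=+1067.790061794"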
observedRunningTime="2025-12-04 15:37:40.7378692 +0000 UTC m=+1068.172539077" watchObservedRunningTime="2025-12-04 15:37:40.753858914 +0000 UTC m=+1068.188528771" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.771767 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8r4vm" podStartSLOduration=14.608715077 podStartE2EDuration="23.771739614s" podCreationTimestamp="2025-12-04 15:37:17 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.891215104 +0000 UTC m=+1052.325884961" lastFinishedPulling="2025-12-04 15:37:34.054239641 +0000 UTC m=+1061.488909498" observedRunningTime="2025-12-04 15:37:40.756409758 +0000 UTC m=+1068.191079615" watchObservedRunningTime="2025-12-04 15:37:40.771739614 +0000 UTC m=+1068.206409461" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.793137 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.365814089 podStartE2EDuration="25.793108265s" podCreationTimestamp="2025-12-04 15:37:15 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.943051585 +0000 UTC m=+1051.377721442" lastFinishedPulling="2025-12-04 15:37:40.370345761 +0000 UTC m=+1067.805015618" observedRunningTime="2025-12-04 15:37:40.781573309 +0000 UTC m=+1068.216243166" watchObservedRunningTime="2025-12-04 15:37:40.793108265 +0000 UTC m=+1068.227778112" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.970842 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rj748"] Dec 04 15:37:40 crc kubenswrapper[4676]: E1204 15:37:40.973245 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d23694-3775-496d-ba9a-888abb40ea10" containerName="init" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.973353 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d23694-3775-496d-ba9a-888abb40ea10" containerName="init" Dec 04 15:37:40 crc kubenswrapper[4676]: E1204 15:37:40.973426 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="dnsmasq-dns" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.973501 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="dnsmasq-dns" Dec 04 15:37:40 crc kubenswrapper[4676]: E1204 15:37:40.973577 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="init" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.973632 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="init" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.973876 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d23694-3775-496d-ba9a-888abb40ea10" containerName="init" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.973980 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6b4fb4d-9b61-414f-a78c-71a143c965d2" containerName="dnsmasq-dns" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.975211 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.977795 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994718 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovs-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994782 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994821 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm9cb\" (UniqueName: \"kubernetes.io/projected/3ca82100-5ba8-449c-a122-fbc3277ba4d7-kube-api-access-bm9cb\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994843 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovn-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-combined-ca-bundle\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:40 crc kubenswrapper[4676]: I1204 15:37:40.994954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca82100-5ba8-449c-a122-fbc3277ba4d7-config\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.002241 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rj748"] Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.096512 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.096868 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm9cb\" (UniqueName: \"kubernetes.io/projected/3ca82100-5ba8-449c-a122-fbc3277ba4d7-kube-api-access-bm9cb\") pod 
\"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097071 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovn-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-combined-ca-bundle\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097416 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovn-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097445 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca82100-5ba8-449c-a122-fbc3277ba4d7-config\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097695 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovs-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.097816 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3ca82100-5ba8-449c-a122-fbc3277ba4d7-ovs-rundir\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.098469 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ca82100-5ba8-449c-a122-fbc3277ba4d7-config\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.101069 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-combined-ca-bundle\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.101102 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ca82100-5ba8-449c-a122-fbc3277ba4d7-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 
04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.119657 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"] Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.121260 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.123815 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.129653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm9cb\" (UniqueName: \"kubernetes.io/projected/3ca82100-5ba8-449c-a122-fbc3277ba4d7-kube-api-access-bm9cb\") pod \"ovn-controller-metrics-rj748\" (UID: \"3ca82100-5ba8-449c-a122-fbc3277ba4d7\") " pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.141588 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"] Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.166963 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.198853 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.198893 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.198969 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4t4x\" (UniqueName: \"kubernetes.io/projected/6e63df25-c8c4-43be-85eb-858b4d0638ec-kube-api-access-z4t4x\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.199129 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.216366 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.299497 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
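
Annotation: the probe transitions above (ovsdbserver-nb-0 startup "unhealthy" at 15:37:41.166963, then "started" at 15:37:41.216366) are driven by TCPSocket probes, which reduce to a timed TCP dial against the pod IP; the container dump earlier shows the dnsmasq pods probing TCP 5353 the same way. A sketch of the equivalent check; the address below is hypothetical:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    // tcpProbe succeeds as soon as a TCP connection can be established,
    // which is all a TCPSocket startup/readiness probe tests.
    func tcpProbe(addr string, timeout time.Duration) error {
        conn, err := net.DialTimeout("tcp", addr, timeout)
        if err != nil {
            return err // reported as "unhealthy" until a dial succeeds
        }
        return conn.Close()
    }

    func main() {
        fmt.Println(tcpProbe("10.217.0.99:5353", 5*time.Second))
    }

Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.299738 4676 util.go:30] "No sandbox for pod can be found.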
Need to start a new one" pod="openstack/ovn-controller-metrics-rj748" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.300267 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.300326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.300408 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4t4x\" (UniqueName: \"kubernetes.io/projected/6e63df25-c8c4-43be-85eb-858b4d0638ec-kube-api-access-z4t4x\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.300489 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.301293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.301320 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.301584 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.332856 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4t4x\" (UniqueName: \"kubernetes.io/projected/6e63df25-c8c4-43be-85eb-858b4d0638ec-kube-api-access-z4t4x\") pod \"dnsmasq-dns-99c545849-x44wf\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") " pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.368379 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"]
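
Annotation: the VerifyControllerAttachedVolume / MountVolume.SetUp pairs above walk each volume of dnsmasq-dns-99c545849-x44wf through the kubelet volume reconciler. The UniqueName plugin prefixes identify three ConfigMap volumes (config, dns-svc, ovsdbserver-nb) plus one kubelet-generated projected service-account token volume, and the pod is already being DELETEd by the API while its volumes are still being set up (it is superseded below by dnsmasq-dns-cb5995467-h5mqs, whose volume set adds ovsdbserver-sb). Re-typed in Go corev1 terms; the ConfigMap object names are assumptions inferred from the volume names:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // configMapVolume assumes the backing ConfigMap shares the volume's name.
    func configMapVolume(name string) corev1.Volume {
        return corev1.Volume{
            Name: name,
            VolumeSource: corev1.VolumeSource{
                ConfigMap: &corev1.ConfigMapVolumeSource{
                    LocalObjectReference: corev1.LocalObjectReference{Name: name},
                },
            },
        }
    }

    func main() {
        volumes := []corev1.Volume{
            configMapVolume("config"),
            configMapVolume("dns-svc"),
            configMapVolume("ovsdbserver-nb"),
            // kube-api-access-* is the kubelet-injected projected volume
            // (service-account token, ca.crt, namespace); sources elided here.
            {Name: "kube-api-access-z4t4x", VolumeSource: corev1.VolumeSource{
                Projected: &corev1.ProjectedVolumeSource{},
            }},
        }
        for _, v := range volumes {
            fmt.Println(v.Name)
        }
    }

Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.369056 4676 util.go:30] "No sandbox for pod can be found.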
Need to start a new one" pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.404442 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.407766 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.412866 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.415367 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.605936 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.605992 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbplk\" (UniqueName: \"kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.606020 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-config\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.606296 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.606622 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.634590 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.710081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.710150 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbplk\" (UniqueName: 
\"kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.710176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-config\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.710196 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.711114 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.711142 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.711225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-config\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.712131 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.712482 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.740914 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbplk\" (UniqueName: \"kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk\") pod \"dnsmasq-dns-cb5995467-h5mqs\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.756382 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c52ad2e5-0a77-4894-8535-30b4e98cdda9","Type":"ContainerStarted","Data":"9ac068edb9aeddebfcf5ed4cfdf541be63ec51c9bf998142d0d14f0963096d07"} Dec 04 15:37:41 
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.757441 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.757479 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8r4vm"
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.784573 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.914979855 podStartE2EDuration="32.7845549s" podCreationTimestamp="2025-12-04 15:37:09 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.86371232 +0000 UTC m=+1051.298382177" lastFinishedPulling="2025-12-04 15:37:33.733287365 +0000 UTC m=+1061.167957222" observedRunningTime="2025-12-04 15:37:41.784178879 +0000 UTC m=+1069.218848736" watchObservedRunningTime="2025-12-04 15:37:41.7845549 +0000 UTC m=+1069.219224757"
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.809599 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.867505 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rj748"]
Dec 04 15:37:41 crc kubenswrapper[4676]: W1204 15:37:41.869575 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ca82100_5ba8_449c_a122_fbc3277ba4d7.slice/crio-5b48e83a67c761d06cb954e7370f3d8dc67513871f70739a7311ce65fda0687d WatchSource:0}: Error finding container 5b48e83a67c761d06cb954e7370f3d8dc67513871f70739a7311ce65fda0687d: Status 404 returned error can't find the container with id 5b48e83a67c761d06cb954e7370f3d8dc67513871f70739a7311ce65fda0687d
Dec 04 15:37:41 crc kubenswrapper[4676]: I1204 15:37:41.953503 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"]
Need to start a new one" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.300125 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.344280 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.469221 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:42 crc kubenswrapper[4676]: W1204 15:37:42.472042 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59af6e26_0d45_4851_90c6_86aea6fa7c49.slice/crio-1a8512f1e3e4061b3e94e6e9011aadb42dcc015ead9b8ba52af7269d875674f6 WatchSource:0}: Error finding container 1a8512f1e3e4061b3e94e6e9011aadb42dcc015ead9b8ba52af7269d875674f6: Status 404 returned error can't find the container with id 1a8512f1e3e4061b3e94e6e9011aadb42dcc015ead9b8ba52af7269d875674f6 Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.764560 4676 generic.go:334] "Generic (PLEG): container finished" podID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerID="de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5" exitCode=0 Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.764628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" event={"ID":"59af6e26-0d45-4851-90c6-86aea6fa7c49","Type":"ContainerDied","Data":"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.764892 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" event={"ID":"59af6e26-0d45-4851-90c6-86aea6fa7c49","Type":"ContainerStarted","Data":"1a8512f1e3e4061b3e94e6e9011aadb42dcc015ead9b8ba52af7269d875674f6"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.766316 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rj748" event={"ID":"3ca82100-5ba8-449c-a122-fbc3277ba4d7","Type":"ContainerStarted","Data":"204efe237c07658a978b4ec0e257d9ed9532942c74080ed287c27dbc9d1d7d46"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.766346 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rj748" event={"ID":"3ca82100-5ba8-449c-a122-fbc3277ba4d7","Type":"ContainerStarted","Data":"5b48e83a67c761d06cb954e7370f3d8dc67513871f70739a7311ce65fda0687d"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.773658 4676 generic.go:334] "Generic (PLEG): container finished" podID="6e63df25-c8c4-43be-85eb-858b4d0638ec" containerID="55d508affd40951ccafd5623e6d5ad3a645e1e2746f7ed6c9fde22d2d360b18e" exitCode=0 Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.773780 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99c545849-x44wf" event={"ID":"6e63df25-c8c4-43be-85eb-858b4d0638ec","Type":"ContainerDied","Data":"55d508affd40951ccafd5623e6d5ad3a645e1e2746f7ed6c9fde22d2d360b18e"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.773807 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99c545849-x44wf" event={"ID":"6e63df25-c8c4-43be-85eb-858b4d0638ec","Type":"ContainerStarted","Data":"eb7e7320c18b00af48ee5d83f80fa4ece487987b3aad93f721bf5ae414b24313"} Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 
Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.776592 4676 generic.go:334] "Generic (PLEG): container finished" podID="3588a213-92d7-43d7-8a28-6a9104f1d48e" containerID="0f58bd77c81dabe6a9838b0f54fa6f143a7b76405935823f03d620ee6ea2aed2" exitCode=0
Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.776763 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3588a213-92d7-43d7-8a28-6a9104f1d48e","Type":"ContainerDied","Data":"0f58bd77c81dabe6a9838b0f54fa6f143a7b76405935823f03d620ee6ea2aed2"}
Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.813859 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rj748" podStartSLOduration=2.813832445 podStartE2EDuration="2.813832445s" podCreationTimestamp="2025-12-04 15:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:37:42.813282819 +0000 UTC m=+1070.247952686" watchObservedRunningTime="2025-12-04 15:37:42.813832445 +0000 UTC m=+1070.248502302"
Dec 04 15:37:42 crc kubenswrapper[4676]: I1204 15:37:42.837594 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.172186 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99c545849-x44wf"
Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.254697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config\") pod \"6e63df25-c8c4-43be-85eb-858b4d0638ec\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") "
Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.254971 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb\") pod \"6e63df25-c8c4-43be-85eb-858b4d0638ec\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") "
Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.255014 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc\") pod \"6e63df25-c8c4-43be-85eb-858b4d0638ec\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") "
Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.255057 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4t4x\" (UniqueName: \"kubernetes.io/projected/6e63df25-c8c4-43be-85eb-858b4d0638ec-kube-api-access-z4t4x\") pod \"6e63df25-c8c4-43be-85eb-858b4d0638ec\" (UID: \"6e63df25-c8c4-43be-85eb-858b4d0638ec\") "
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.293337 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6e63df25-c8c4-43be-85eb-858b4d0638ec" (UID: "6e63df25-c8c4-43be-85eb-858b4d0638ec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.316721 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config" (OuterVolumeSpecName: "config") pod "6e63df25-c8c4-43be-85eb-858b4d0638ec" (UID: "6e63df25-c8c4-43be-85eb-858b4d0638ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.359024 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.359063 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4t4x\" (UniqueName: \"kubernetes.io/projected/6e63df25-c8c4-43be-85eb-858b4d0638ec-kube-api-access-z4t4x\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.359076 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.370468 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6e63df25-c8c4-43be-85eb-858b4d0638ec" (UID: "6e63df25-c8c4-43be-85eb-858b4d0638ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.415116 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 04 15:37:43 crc kubenswrapper[4676]: E1204 15:37:43.415412 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e63df25-c8c4-43be-85eb-858b4d0638ec" containerName="init" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.415424 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e63df25-c8c4-43be-85eb-858b4d0638ec" containerName="init" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.415607 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e63df25-c8c4-43be-85eb-858b4d0638ec" containerName="init" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.416578 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.431538 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.440403 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lrvnt" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.440765 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.441020 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.461952 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6e63df25-c8c4-43be-85eb-858b4d0638ec-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.467725 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.521236 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.523802 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.573953 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-scripts\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574000 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6sqq\" (UniqueName: \"kubernetes.io/projected/401b9eed-f3f4-4794-bab2-83bc5fd89deb-kube-api-access-g6sqq\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574033 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-config\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574063 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574084 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574157 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.574179 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.642556 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.646595 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676325 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676353 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676393 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-scripts\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676410 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6sqq\" (UniqueName: \"kubernetes.io/projected/401b9eed-f3f4-4794-bab2-83bc5fd89deb-kube-api-access-g6sqq\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676435 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-config\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.676466 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.678011 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-config\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.678134 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.678940 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/401b9eed-f3f4-4794-bab2-83bc5fd89deb-scripts\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.682462 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.683691 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.692202 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.695215 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/401b9eed-f3f4-4794-bab2-83bc5fd89deb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.710686 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6sqq\" (UniqueName: \"kubernetes.io/projected/401b9eed-f3f4-4794-bab2-83bc5fd89deb-kube-api-access-g6sqq\") pod \"ovn-northd-0\" (UID: \"401b9eed-f3f4-4794-bab2-83bc5fd89deb\") " pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.748465 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.778391 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.778491 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.778523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9rk\" (UniqueName: \"kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.778564 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.778685 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.800331 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" event={"ID":"59af6e26-0d45-4851-90c6-86aea6fa7c49","Type":"ContainerStarted","Data":"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88"} Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.800407 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.805898 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99c545849-x44wf" event={"ID":"6e63df25-c8c4-43be-85eb-858b4d0638ec","Type":"ContainerDied","Data":"eb7e7320c18b00af48ee5d83f80fa4ece487987b3aad93f721bf5ae414b24313"} Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.806013 4676 scope.go:117] "RemoveContainer" containerID="55d508affd40951ccafd5623e6d5ad3a645e1e2746f7ed6c9fde22d2d360b18e" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.806278 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99c545849-x44wf" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.811383 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3588a213-92d7-43d7-8a28-6a9104f1d48e","Type":"ContainerStarted","Data":"dc2ea21f90c0f51efcf8cef74902eee94bb80d264e153e98e5d17eca7450ee96"} Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.815977 4676 generic.go:334] "Generic (PLEG): container finished" podID="c83d9914-203c-4a22-a92f-80851859fd48" containerID="7dcb20f95ba2c36d461fad9df709170e9632819ba97a81bff20c81a9af750c0b" exitCode=0 Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.816114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerDied","Data":"7dcb20f95ba2c36d461fad9df709170e9632819ba97a81bff20c81a9af750c0b"} Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.843526 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" podStartSLOduration=2.843501532 podStartE2EDuration="2.843501532s" podCreationTimestamp="2025-12-04 15:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:37:43.823093589 +0000 UTC m=+1071.257763456" watchObservedRunningTime="2025-12-04 15:37:43.843501532 +0000 UTC m=+1071.278171389" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.874985 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.879899 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.879972 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9rk\" (UniqueName: \"kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.880084 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.880127 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.880235 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") 
" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.881289 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.882517 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.883380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.885494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.886897 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-99c545849-x44wf"] Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.910338 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9rk\" (UniqueName: \"kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk\") pod \"dnsmasq-dns-65c4f5b9f5-bvf7v\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.923555 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.849964405 podStartE2EDuration="35.923527807s" podCreationTimestamp="2025-12-04 15:37:08 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.136598739 +0000 UTC m=+1051.571268586" lastFinishedPulling="2025-12-04 15:37:34.210162131 +0000 UTC m=+1061.644831988" observedRunningTime="2025-12-04 15:37:43.911028504 +0000 UTC m=+1071.345698361" watchObservedRunningTime="2025-12-04 15:37:43.923527807 +0000 UTC m=+1071.358197674" Dec 04 15:37:43 crc kubenswrapper[4676]: I1204 15:37:43.977537 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.287385 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.442082 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:37:44 crc kubenswrapper[4676]: W1204 15:37:44.456535 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bc34e2_332b_4bb5_bb8f_dc5e3992be13.slice/crio-74e996b868cd08e29ec9973f8980ed0733fcb88cf705ef1a58d5367a7e17f63f WatchSource:0}: Error finding container 74e996b868cd08e29ec9973f8980ed0733fcb88cf705ef1a58d5367a7e17f63f: Status 404 returned error can't find the container with id 74e996b868cd08e29ec9973f8980ed0733fcb88cf705ef1a58d5367a7e17f63f Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.658429 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.674894 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.677631 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.677717 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.677869 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rkv6b" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.678561 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.684246 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.794118 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-lock\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.794194 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-cache\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.794312 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwwql\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-kube-api-access-hwwql\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.794365 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.794391 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.840724 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"401b9eed-f3f4-4794-bab2-83bc5fd89deb","Type":"ContainerStarted","Data":"2357f4813ba43a95302aa311d2e4f2c79635a25813c85698fc1017112a205a0f"}
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.843212 4676 generic.go:334] "Generic (PLEG): container finished" podID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerID="ea3be10bae902b06928a512ae90832e1a78f9cca3811a12c5659ffa5d80f6c65" exitCode=0
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.843312 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" event={"ID":"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13","Type":"ContainerDied","Data":"ea3be10bae902b06928a512ae90832e1a78f9cca3811a12c5659ffa5d80f6c65"}
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.843600 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" event={"ID":"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13","Type":"ContainerStarted","Data":"74e996b868cd08e29ec9973f8980ed0733fcb88cf705ef1a58d5367a7e17f63f"}
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.846613 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="dnsmasq-dns" containerID="cri-o://de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88" gracePeriod=10
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.895844 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-cache\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.896309 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwwql\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-kube-api-access-hwwql\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.896401 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-cache\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.896603 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
\"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.896948 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-lock\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.897465 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/61ed17c4-ad81-4738-ac71-3b97f42d5211-lock\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.897632 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: E1204 15:37:44.897805 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 15:37:44 crc kubenswrapper[4676]: E1204 15:37:44.897928 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 15:37:44 crc kubenswrapper[4676]: E1204 15:37:44.898200 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:37:45.398168025 +0000 UTC m=+1072.832837872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.919437 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwwql\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-kube-api-access-hwwql\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:44 crc kubenswrapper[4676]: I1204 15:37:44.920171 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.377471 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.398829 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e63df25-c8c4-43be-85eb-858b4d0638ec" path="/var/lib/kubelet/pods/6e63df25-c8c4-43be-85eb-858b4d0638ec/volumes" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.403730 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc\") pod \"59af6e26-0d45-4851-90c6-86aea6fa7c49\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.403817 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-config\") pod \"59af6e26-0d45-4851-90c6-86aea6fa7c49\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.403859 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb\") pod \"59af6e26-0d45-4851-90c6-86aea6fa7c49\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.404066 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb\") pod \"59af6e26-0d45-4851-90c6-86aea6fa7c49\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.404699 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbplk\" (UniqueName: \"kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk\") pod \"59af6e26-0d45-4851-90c6-86aea6fa7c49\" (UID: \"59af6e26-0d45-4851-90c6-86aea6fa7c49\") " Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.406468 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.413196 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.413236 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.413276 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:37:46.41325973 +0000 UTC m=+1073.847929587 (durationBeforeRetry 1s). 
Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.413276 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:37:46.41325973 +0000 UTC m=+1073.847929587 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.431640 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk" (OuterVolumeSpecName: "kube-api-access-rbplk") pod "59af6e26-0d45-4851-90c6-86aea6fa7c49" (UID: "59af6e26-0d45-4851-90c6-86aea6fa7c49"). InnerVolumeSpecName "kube-api-access-rbplk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.478416 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59af6e26-0d45-4851-90c6-86aea6fa7c49" (UID: "59af6e26-0d45-4851-90c6-86aea6fa7c49"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.487744 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59af6e26-0d45-4851-90c6-86aea6fa7c49" (UID: "59af6e26-0d45-4851-90c6-86aea6fa7c49"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.497679 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59af6e26-0d45-4851-90c6-86aea6fa7c49" (UID: "59af6e26-0d45-4851-90c6-86aea6fa7c49"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.509273 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbplk\" (UniqueName: \"kubernetes.io/projected/59af6e26-0d45-4851-90c6-86aea6fa7c49-kube-api-access-rbplk\") on node \"crc\" DevicePath \"\""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.509315 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.509327 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.509341 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.611085 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59af6e26-0d45-4851-90c6-86aea6fa7c49-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.859393 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" event={"ID":"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13","Type":"ContainerStarted","Data":"9635365c5da448fbfb1e015d65b56f91ae595b8cbdb34438229c451f1a235dbd"} Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.859525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.861778 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"401b9eed-f3f4-4794-bab2-83bc5fd89deb","Type":"ContainerStarted","Data":"d7cc93572cb5bfecd3836da46e5a1c98f5f691f6af6fac2b052aa8624fc024fc"} Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.861810 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"401b9eed-f3f4-4794-bab2-83bc5fd89deb","Type":"ContainerStarted","Data":"2a7e8bef8d6067f1654ad11bb946e77c4484360c7b3dfa0552929ff8c13fe1ae"} Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.861960 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.863969 4676 generic.go:334] "Generic (PLEG): container finished" podID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerID="de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88" exitCode=0 Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.864010 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" event={"ID":"59af6e26-0d45-4851-90c6-86aea6fa7c49","Type":"ContainerDied","Data":"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88"} Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.864037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" event={"ID":"59af6e26-0d45-4851-90c6-86aea6fa7c49","Type":"ContainerDied","Data":"1a8512f1e3e4061b3e94e6e9011aadb42dcc015ead9b8ba52af7269d875674f6"} Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.864037 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb5995467-h5mqs" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.864055 4676 scope.go:117] "RemoveContainer" containerID="de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.883443 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" podStartSLOduration=2.88342453 podStartE2EDuration="2.88342453s" podCreationTimestamp="2025-12-04 15:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:37:45.878073795 +0000 UTC m=+1073.312743662" watchObservedRunningTime="2025-12-04 15:37:45.88342453 +0000 UTC m=+1073.318094387" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.903342 4676 scope.go:117] "RemoveContainer" containerID="de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.911255 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.923005 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb5995467-h5mqs"] Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.932182 4676 scope.go:117] "RemoveContainer" containerID="de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88" Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.935578 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88\": container with ID starting with de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88 not found: ID does not exist" containerID="de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.935662 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88"} err="failed to get container status \"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88\": rpc error: code = NotFound desc = could not find container \"de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88\": container with ID starting with de5807f2c56020c6f0ec358cb0bd3fa8c38282712a0bb952683513dca63c3f88 not found: ID does not exist" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.935697 4676 scope.go:117] "RemoveContainer" containerID="de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5" Dec 04 15:37:45 crc kubenswrapper[4676]: E1204 15:37:45.942080 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5\": container with ID starting with de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5 not found: ID does not exist" containerID="de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5" Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.942213 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5"} err="failed to get container status \"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5\": rpc error: code 
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.942213 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5"} err="failed to get container status \"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5\": rpc error: code = NotFound desc = could not find container \"de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5\": container with ID starting with de373cfcd070f31b3be6af2de912d16477153e064379770cb5eb8c45005914e5 not found: ID does not exist"
Dec 04 15:37:45 crc kubenswrapper[4676]: I1204 15:37:45.945088 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.098850246 podStartE2EDuration="2.945057791s" podCreationTimestamp="2025-12-04 15:37:43 +0000 UTC" firstStartedPulling="2025-12-04 15:37:44.301822348 +0000 UTC m=+1071.736492205" lastFinishedPulling="2025-12-04 15:37:45.148029893 +0000 UTC m=+1072.582699750" observedRunningTime="2025-12-04 15:37:45.924785112 +0000 UTC m=+1073.359454969" watchObservedRunningTime="2025-12-04 15:37:45.945057791 +0000 UTC m=+1073.379727648"
Dec 04 15:37:46 crc kubenswrapper[4676]: I1204 15:37:46.026643 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:37:46 crc kubenswrapper[4676]: I1204 15:37:46.026720 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:37:46 crc kubenswrapper[4676]: E1204 15:37:46.428589 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 04 15:37:46 crc kubenswrapper[4676]: E1204 15:37:46.428833 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 04 15:37:46 crc kubenswrapper[4676]: E1204 15:37:46.428899 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:37:48.428875728 +0000 UTC m=+1075.863545585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found
Dec 04 15:37:46 crc kubenswrapper[4676]: I1204 15:37:46.428118 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:47 crc kubenswrapper[4676]: E1204 15:37:47.074995 4676 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:41762->38.102.83.158:40877: write tcp 38.102.83.158:41762->38.102.83.158:40877: write: broken pipe
Dec 04 15:37:47 crc kubenswrapper[4676]: I1204 15:37:47.393634 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" path="/var/lib/kubelet/pods/59af6e26-0d45-4851-90c6-86aea6fa7c49/volumes"
Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.459951 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0"
Dec 04 15:37:48 crc kubenswrapper[4676]: E1204 15:37:48.460609 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 04 15:37:48 crc kubenswrapper[4676]: E1204 15:37:48.460629 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.476591 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ksj54"] Dec 04 15:37:48 crc kubenswrapper[4676]: E1204 15:37:48.477050 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="dnsmasq-dns" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.477078 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="dnsmasq-dns" Dec 04 15:37:48 crc kubenswrapper[4676]: E1204 15:37:48.477116 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="init" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.477126 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="init" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.477339 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="59af6e26-0d45-4851-90c6-86aea6fa7c49" containerName="dnsmasq-dns" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.478131 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.480717 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.481286 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.489429 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.497741 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ksj54"] Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.662981 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpsd7\" (UniqueName: \"kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.663081 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.663423 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.663563 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.663797 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.663978 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.664073 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765240 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765291 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765341 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765382 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765406 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpsd7\" (UniqueName: \"kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.765682 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.766344 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.766892 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.767288 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.772817 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.773085 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.774379 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.783624 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpsd7\" (UniqueName: \"kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7\") pod \"swift-ring-rebalance-ksj54\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:48 crc kubenswrapper[4676]: I1204 15:37:48.796538 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:37:49 crc kubenswrapper[4676]: I1204 15:37:49.953876 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 04 15:37:49 crc kubenswrapper[4676]: I1204 15:37:49.954226 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 04 15:37:50 crc kubenswrapper[4676]: I1204 15:37:50.020588 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 04 15:37:50 crc kubenswrapper[4676]: I1204 15:37:50.401426 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-ksj54"] Dec 04 15:37:50 crc kubenswrapper[4676]: W1204 15:37:50.407093 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode03c083b_3422_4f69_9355_7e8354125352.slice/crio-ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41 WatchSource:0}: Error finding container ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41: Status 404 returned error can't find the container with id ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41 Dec 04 15:37:50 crc kubenswrapper[4676]: I1204 15:37:50.914493 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ksj54" event={"ID":"e03c083b-3422-4f69-9355-7e8354125352","Type":"ContainerStarted","Data":"ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41"} Dec 04 15:37:50 crc kubenswrapper[4676]: I1204 15:37:50.984948 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.321247 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.321596 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.415945 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.529927 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5bnn8"] Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.531190 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5bnn8" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.541446 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5bnn8"] Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.618063 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-845xd\" (UniqueName: \"kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd\") pod \"placement-db-create-5bnn8\" (UID: \"01d480ec-6f21-494a-b5b6-d58c1842077d\") " pod="openstack/placement-db-create-5bnn8" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.720700 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-845xd\" (UniqueName: \"kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd\") pod \"placement-db-create-5bnn8\" (UID: \"01d480ec-6f21-494a-b5b6-d58c1842077d\") " pod="openstack/placement-db-create-5bnn8" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.744316 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-845xd\" (UniqueName: \"kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd\") pod \"placement-db-create-5bnn8\" (UID: \"01d480ec-6f21-494a-b5b6-d58c1842077d\") " pod="openstack/placement-db-create-5bnn8" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.868653 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bnn8" Dec 04 15:37:51 crc kubenswrapper[4676]: I1204 15:37:51.936989 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerStarted","Data":"e0cb13f40dccb5fead31bf4eb65dffd748aa720a5ee83c48c9486a82f5dc88ce"} Dec 04 15:37:52 crc kubenswrapper[4676]: I1204 15:37:52.090705 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 04 15:37:52 crc kubenswrapper[4676]: I1204 15:37:52.411379 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5bnn8"] Dec 04 15:37:52 crc kubenswrapper[4676]: W1204 15:37:52.417130 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d480ec_6f21_494a_b5b6_d58c1842077d.slice/crio-84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994 WatchSource:0}: Error finding container 84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994: Status 404 returned error can't find the container with id 84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994 Dec 04 15:37:52 crc kubenswrapper[4676]: I1204 15:37:52.538089 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:37:52 crc kubenswrapper[4676]: E1204 15:37:52.538803 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 15:37:52 crc kubenswrapper[4676]: E1204 15:37:52.538848 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 
15:37:52 crc kubenswrapper[4676]: E1204 15:37:52.539170 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:38:00.538899719 +0000 UTC m=+1087.973569576 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found Dec 04 15:37:52 crc kubenswrapper[4676]: I1204 15:37:52.945202 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bnn8" event={"ID":"01d480ec-6f21-494a-b5b6-d58c1842077d","Type":"ContainerStarted","Data":"84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994"} Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.374890 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-wthdz"] Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.376222 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wthdz" Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.408864 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-wthdz"] Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.453265 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqmxf\" (UniqueName: \"kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf\") pod \"watcher-db-create-wthdz\" (UID: \"a0c18d40-c03a-4c87-aa2c-ad743179dd6f\") " pod="openstack/watcher-db-create-wthdz" Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.555704 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqmxf\" (UniqueName: \"kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf\") pod \"watcher-db-create-wthdz\" (UID: \"a0c18d40-c03a-4c87-aa2c-ad743179dd6f\") " pod="openstack/watcher-db-create-wthdz" Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.574403 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqmxf\" (UniqueName: \"kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf\") pod \"watcher-db-create-wthdz\" (UID: \"a0c18d40-c03a-4c87-aa2c-ad743179dd6f\") " pod="openstack/watcher-db-create-wthdz" Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.701269 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-wthdz" Dec 04 15:37:53 crc kubenswrapper[4676]: I1204 15:37:53.981083 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.033564 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"] Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.034129 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="dnsmasq-dns" containerID="cri-o://4be8f5776f5b5a8419db302f71216b959f822b3d4354c9ee10f9042623971077" gracePeriod=10 Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.220841 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-wthdz"] Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.965384 4676 generic.go:334] "Generic (PLEG): container finished" podID="485b242f-88d0-4521-a25c-e9a957a58e19" containerID="4be8f5776f5b5a8419db302f71216b959f822b3d4354c9ee10f9042623971077" exitCode=0 Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.965480 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" event={"ID":"485b242f-88d0-4521-a25c-e9a957a58e19","Type":"ContainerDied","Data":"4be8f5776f5b5a8419db302f71216b959f822b3d4354c9ee10f9042623971077"} Dec 04 15:37:54 crc kubenswrapper[4676]: I1204 15:37:54.967109 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wthdz" event={"ID":"a0c18d40-c03a-4c87-aa2c-ad743179dd6f","Type":"ContainerStarted","Data":"6a9d44b7116b58a3b03690aeb5ea1eace5ee8a1081b78a597c9fba104c124e5c"} Dec 04 15:37:56 crc kubenswrapper[4676]: I1204 15:37:56.985115 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerStarted","Data":"792e5fc1ea1f84be22980db1caeb6f4cee61e88cb54981aea5250f563e98dd20"} Dec 04 15:37:56 crc kubenswrapper[4676]: I1204 15:37:56.987155 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bnn8" event={"ID":"01d480ec-6f21-494a-b5b6-d58c1842077d","Type":"ContainerStarted","Data":"60ff3b9eb0e5b32f3f88a2b5a018541eb684066e81047d95f9f5804ef5698b36"} Dec 04 15:37:57 crc kubenswrapper[4676]: I1204 15:37:57.006239 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-5bnn8" podStartSLOduration=6.006217244 podStartE2EDuration="6.006217244s" podCreationTimestamp="2025-12-04 15:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:37:57.00334751 +0000 UTC m=+1084.438017377" watchObservedRunningTime="2025-12-04 15:37:57.006217244 +0000 UTC m=+1084.440887101" Dec 04 15:37:57 crc kubenswrapper[4676]: I1204 15:37:57.160021 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.104:5353: connect: connection refused" Dec 04 15:37:57 crc kubenswrapper[4676]: I1204 15:37:57.999195 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wthdz" 
event={"ID":"a0c18d40-c03a-4c87-aa2c-ad743179dd6f","Type":"ContainerStarted","Data":"44bcde57ce210f1f46a6edbc76309d1f472463aa1c1d13dc7f10b8a8e30431f8"} Dec 04 15:37:58 crc kubenswrapper[4676]: I1204 15:37:58.043212 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-create-wthdz" podStartSLOduration=5.043187263 podStartE2EDuration="5.043187263s" podCreationTimestamp="2025-12-04 15:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:37:58.033933254 +0000 UTC m=+1085.468603111" watchObservedRunningTime="2025-12-04 15:37:58.043187263 +0000 UTC m=+1085.477857120" Dec 04 15:37:58 crc kubenswrapper[4676]: I1204 15:37:58.810077 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.009491 4676 generic.go:334] "Generic (PLEG): container finished" podID="a0c18d40-c03a-4c87-aa2c-ad743179dd6f" containerID="44bcde57ce210f1f46a6edbc76309d1f472463aa1c1d13dc7f10b8a8e30431f8" exitCode=0 Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.009572 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wthdz" event={"ID":"a0c18d40-c03a-4c87-aa2c-ad743179dd6f","Type":"ContainerDied","Data":"44bcde57ce210f1f46a6edbc76309d1f472463aa1c1d13dc7f10b8a8e30431f8"} Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.012010 4676 generic.go:334] "Generic (PLEG): container finished" podID="01d480ec-6f21-494a-b5b6-d58c1842077d" containerID="60ff3b9eb0e5b32f3f88a2b5a018541eb684066e81047d95f9f5804ef5698b36" exitCode=0 Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.012063 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bnn8" event={"ID":"01d480ec-6f21-494a-b5b6-d58c1842077d","Type":"ContainerDied","Data":"60ff3b9eb0e5b32f3f88a2b5a018541eb684066e81047d95f9f5804ef5698b36"} Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.136507 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.266860 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc\") pod \"485b242f-88d0-4521-a25c-e9a957a58e19\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.266999 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp248\" (UniqueName: \"kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248\") pod \"485b242f-88d0-4521-a25c-e9a957a58e19\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.267115 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config\") pod \"485b242f-88d0-4521-a25c-e9a957a58e19\" (UID: \"485b242f-88d0-4521-a25c-e9a957a58e19\") " Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.273639 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248" (OuterVolumeSpecName: "kube-api-access-tp248") pod "485b242f-88d0-4521-a25c-e9a957a58e19" (UID: "485b242f-88d0-4521-a25c-e9a957a58e19"). InnerVolumeSpecName "kube-api-access-tp248". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.314004 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "485b242f-88d0-4521-a25c-e9a957a58e19" (UID: "485b242f-88d0-4521-a25c-e9a957a58e19"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.318679 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config" (OuterVolumeSpecName: "config") pod "485b242f-88d0-4521-a25c-e9a957a58e19" (UID: "485b242f-88d0-4521-a25c-e9a957a58e19"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.370180 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp248\" (UniqueName: \"kubernetes.io/projected/485b242f-88d0-4521-a25c-e9a957a58e19-kube-api-access-tp248\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.370474 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:37:59 crc kubenswrapper[4676]: I1204 15:37:59.370485 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/485b242f-88d0-4521-a25c-e9a957a58e19-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.022390 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" event={"ID":"485b242f-88d0-4521-a25c-e9a957a58e19","Type":"ContainerDied","Data":"188fbfc406ca281b17eefc85ca916ee21694f76f209653d75608a0f751c2a49c"} Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.022475 4676 scope.go:117] "RemoveContainer" containerID="4be8f5776f5b5a8419db302f71216b959f822b3d4354c9ee10f9042623971077" Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.022726 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b7696bc7-6t68r" Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.050143 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"] Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.061043 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b7696bc7-6t68r"] Dec 04 15:38:00 crc kubenswrapper[4676]: I1204 15:38:00.593104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:38:00 crc kubenswrapper[4676]: E1204 15:38:00.593349 4676 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 04 15:38:00 crc kubenswrapper[4676]: E1204 15:38:00.593378 4676 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 04 15:38:00 crc kubenswrapper[4676]: E1204 15:38:00.593443 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift podName:61ed17c4-ad81-4738-ac71-3b97f42d5211 nodeName:}" failed. No retries permitted until 2025-12-04 15:38:16.593419618 +0000 UTC m=+1104.028089475 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift") pod "swift-storage-0" (UID: "61ed17c4-ad81-4738-ac71-3b97f42d5211") : configmap "swift-ring-files" not found Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.128279 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zps4k"] Dec 04 15:38:01 crc kubenswrapper[4676]: E1204 15:38:01.128747 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="dnsmasq-dns" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.128769 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="dnsmasq-dns" Dec 04 15:38:01 crc kubenswrapper[4676]: E1204 15:38:01.128802 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="init" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.128809 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="init" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.129092 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" containerName="dnsmasq-dns" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.129721 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.155350 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zps4k"] Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.305215 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4nnw\" (UniqueName: \"kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw\") pod \"keystone-db-create-zps4k\" (UID: \"42384168-5df1-4d2c-aec1-501e67ceb44e\") " pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.397595 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485b242f-88d0-4521-a25c-e9a957a58e19" path="/var/lib/kubelet/pods/485b242f-88d0-4521-a25c-e9a957a58e19/volumes" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.407449 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4nnw\" (UniqueName: \"kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw\") pod \"keystone-db-create-zps4k\" (UID: \"42384168-5df1-4d2c-aec1-501e67ceb44e\") " pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.429592 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4nnw\" (UniqueName: \"kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw\") pod \"keystone-db-create-zps4k\" (UID: \"42384168-5df1-4d2c-aec1-501e67ceb44e\") " pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.452591 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.481626 4676 scope.go:117] "RemoveContainer" containerID="655f4974c5d474c3ce92d089cfdd7cc1363536c57807541d0a3429c5aa031a56" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.659957 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5bnn8" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.676768 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wthdz" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.817127 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqmxf\" (UniqueName: \"kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf\") pod \"a0c18d40-c03a-4c87-aa2c-ad743179dd6f\" (UID: \"a0c18d40-c03a-4c87-aa2c-ad743179dd6f\") " Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.817958 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-845xd\" (UniqueName: \"kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd\") pod \"01d480ec-6f21-494a-b5b6-d58c1842077d\" (UID: \"01d480ec-6f21-494a-b5b6-d58c1842077d\") " Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.824059 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf" (OuterVolumeSpecName: "kube-api-access-lqmxf") pod "a0c18d40-c03a-4c87-aa2c-ad743179dd6f" (UID: "a0c18d40-c03a-4c87-aa2c-ad743179dd6f"). InnerVolumeSpecName "kube-api-access-lqmxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.826642 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd" (OuterVolumeSpecName: "kube-api-access-845xd") pod "01d480ec-6f21-494a-b5b6-d58c1842077d" (UID: "01d480ec-6f21-494a-b5b6-d58c1842077d"). InnerVolumeSpecName "kube-api-access-845xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.924540 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-845xd\" (UniqueName: \"kubernetes.io/projected/01d480ec-6f21-494a-b5b6-d58c1842077d-kube-api-access-845xd\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:01 crc kubenswrapper[4676]: I1204 15:38:01.924597 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqmxf\" (UniqueName: \"kubernetes.io/projected/a0c18d40-c03a-4c87-aa2c-ad743179dd6f-kube-api-access-lqmxf\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.002731 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zps4k"] Dec 04 15:38:02 crc kubenswrapper[4676]: W1204 15:38:02.003630 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42384168_5df1_4d2c_aec1_501e67ceb44e.slice/crio-291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb WatchSource:0}: Error finding container 291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb: Status 404 returned error can't find the container with id 291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.067296 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-wthdz" event={"ID":"a0c18d40-c03a-4c87-aa2c-ad743179dd6f","Type":"ContainerDied","Data":"6a9d44b7116b58a3b03690aeb5ea1eace5ee8a1081b78a597c9fba104c124e5c"} Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.067358 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-wthdz" Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.067385 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a9d44b7116b58a3b03690aeb5ea1eace5ee8a1081b78a597c9fba104c124e5c" Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.068418 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zps4k" event={"ID":"42384168-5df1-4d2c-aec1-501e67ceb44e","Type":"ContainerStarted","Data":"291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb"} Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.069820 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ksj54" event={"ID":"e03c083b-3422-4f69-9355-7e8354125352","Type":"ContainerStarted","Data":"37996c0caddd00690195918e2e665067a56aa74dad777c4a536c0e097957b456"} Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.072052 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5bnn8" event={"ID":"01d480ec-6f21-494a-b5b6-d58c1842077d","Type":"ContainerDied","Data":"84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994"} Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.072105 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84c0cd85c5f7fc845f9138f9794ca5d650805f09d14bd8b932799727b8e61994" Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.072075 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5bnn8" Dec 04 15:38:02 crc kubenswrapper[4676]: I1204 15:38:02.095436 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-ksj54" podStartSLOduration=2.97661648 podStartE2EDuration="14.095411228s" podCreationTimestamp="2025-12-04 15:37:48 +0000 UTC" firstStartedPulling="2025-12-04 15:37:50.410963035 +0000 UTC m=+1077.845632892" lastFinishedPulling="2025-12-04 15:38:01.529757783 +0000 UTC m=+1088.964427640" observedRunningTime="2025-12-04 15:38:02.090378051 +0000 UTC m=+1089.525047908" watchObservedRunningTime="2025-12-04 15:38:02.095411228 +0000 UTC m=+1089.530081085" Dec 04 15:38:03 crc kubenswrapper[4676]: I1204 15:38:03.084704 4676 generic.go:334] "Generic (PLEG): container finished" podID="42384168-5df1-4d2c-aec1-501e67ceb44e" containerID="12339c749f3fd592625db4ac9a7ae46f8f0dfc6fd55f38fff1828475441daea4" exitCode=0 Dec 04 15:38:03 crc kubenswrapper[4676]: I1204 15:38:03.084771 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zps4k" event={"ID":"42384168-5df1-4d2c-aec1-501e67ceb44e","Type":"ContainerDied","Data":"12339c749f3fd592625db4ac9a7ae46f8f0dfc6fd55f38fff1828475441daea4"} Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.098641 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerStarted","Data":"cbfeed3e81d3d27196bf6f56dd102c6ebc1dd0161c168a9e38bc71238441f064"} Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.265263 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=11.575970295 podStartE2EDuration="51.265202599s" podCreationTimestamp="2025-12-04 15:37:13 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.166782686 +0000 UTC m=+1051.601452543" lastFinishedPulling="2025-12-04 15:38:03.85601499 +0000 UTC m=+1091.290684847" observedRunningTime="2025-12-04 15:38:04.26214931 +0000 UTC m=+1091.696819177" watchObservedRunningTime="2025-12-04 15:38:04.265202599 +0000 UTC m=+1091.699872456" Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.558681 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.725078 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4nnw\" (UniqueName: \"kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw\") pod \"42384168-5df1-4d2c-aec1-501e67ceb44e\" (UID: \"42384168-5df1-4d2c-aec1-501e67ceb44e\") " Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.731339 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw" (OuterVolumeSpecName: "kube-api-access-p4nnw") pod "42384168-5df1-4d2c-aec1-501e67ceb44e" (UID: "42384168-5df1-4d2c-aec1-501e67ceb44e"). InnerVolumeSpecName "kube-api-access-p4nnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.827759 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4nnw\" (UniqueName: \"kubernetes.io/projected/42384168-5df1-4d2c-aec1-501e67ceb44e-kube-api-access-p4nnw\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:04 crc kubenswrapper[4676]: I1204 15:38:04.913158 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:05 crc kubenswrapper[4676]: I1204 15:38:05.108694 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zps4k" event={"ID":"42384168-5df1-4d2c-aec1-501e67ceb44e","Type":"ContainerDied","Data":"291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb"} Dec 04 15:38:05 crc kubenswrapper[4676]: I1204 15:38:05.108754 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="291948e0e93419a7db0705bddd144e62b73cc6e0c05c94ff8d15897e813540cb" Dec 04 15:38:05 crc kubenswrapper[4676]: I1204 15:38:05.108711 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zps4k" Dec 04 15:38:08 crc kubenswrapper[4676]: I1204 15:38:08.156600 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:38:08 crc kubenswrapper[4676]: I1204 15:38:08.413524 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hdtnf" podUID="ce63098e-8737-4061-94ce-2b8c76ccb26f" containerName="ovn-controller" probeResult="failure" output=< Dec 04 15:38:08 crc kubenswrapper[4676]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 15:38:08 crc kubenswrapper[4676]: > Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.149332 4676 generic.go:334] "Generic (PLEG): container finished" podID="a074e2a9-e6e9-488d-8338-54231ab8faf9" containerID="aab1f4365096fc9f95b98fb41f7d714cf599ff2efa4ba3bf021e19be29151223" exitCode=0 Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.149417 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a074e2a9-e6e9-488d-8338-54231ab8faf9","Type":"ContainerDied","Data":"aab1f4365096fc9f95b98fb41f7d714cf599ff2efa4ba3bf021e19be29151223"} Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.152621 4676 generic.go:334] "Generic (PLEG): container finished" podID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerID="1e79cadee4110746d5dcc8072fd80203a89b940c26619c6972fe68e00666b3ab" exitCode=0 Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.152695 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerDied","Data":"1e79cadee4110746d5dcc8072fd80203a89b940c26619c6972fe68e00666b3ab"} Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.155107 4676 generic.go:334] "Generic (PLEG): container finished" podID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerID="a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c" exitCode=0 Dec 04 15:38:09 crc kubenswrapper[4676]: I1204 15:38:09.155152 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerDied","Data":"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c"} Dec 04 15:38:10 crc 
kubenswrapper[4676]: I1204 15:38:10.165165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a074e2a9-e6e9-488d-8338-54231ab8faf9","Type":"ContainerStarted","Data":"d275fe2adb52dafe6bfcd1eba85a38466048f8b1a31439c2b1274fc6128068bd"} Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.165881 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.167468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerStarted","Data":"dde28b06626f8149535cfc50ea66b8ee5915c6a25e62012e99bd3cb77d058d91"} Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.167982 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.169253 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerStarted","Data":"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa"} Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.170104 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.194204 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=53.039212521 podStartE2EDuration="1m4.194185872s" podCreationTimestamp="2025-12-04 15:37:06 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.055131558 +0000 UTC m=+1050.489801425" lastFinishedPulling="2025-12-04 15:37:34.210104919 +0000 UTC m=+1061.644774776" observedRunningTime="2025-12-04 15:38:10.186860599 +0000 UTC m=+1097.621530476" watchObservedRunningTime="2025-12-04 15:38:10.194185872 +0000 UTC m=+1097.628855729" Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.217745 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.301202797 podStartE2EDuration="1m4.217720566s" podCreationTimestamp="2025-12-04 15:37:06 +0000 UTC" firstStartedPulling="2025-12-04 15:37:23.77525865 +0000 UTC m=+1051.209928507" lastFinishedPulling="2025-12-04 15:37:33.691776419 +0000 UTC m=+1061.126446276" observedRunningTime="2025-12-04 15:38:10.210474215 +0000 UTC m=+1097.645144092" watchObservedRunningTime="2025-12-04 15:38:10.217720566 +0000 UTC m=+1097.652390423" Dec 04 15:38:10 crc kubenswrapper[4676]: I1204 15:38:10.238906 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.317587742 podStartE2EDuration="1m4.23887508s" podCreationTimestamp="2025-12-04 15:37:06 +0000 UTC" firstStartedPulling="2025-12-04 15:37:24.192716869 +0000 UTC m=+1051.627386726" lastFinishedPulling="2025-12-04 15:37:34.114004207 +0000 UTC m=+1061.548674064" observedRunningTime="2025-12-04 15:38:10.233108573 +0000 UTC m=+1097.667778450" watchObservedRunningTime="2025-12-04 15:38:10.23887508 +0000 UTC m=+1097.673544937" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.178482 4676 generic.go:334] "Generic (PLEG): container finished" podID="e03c083b-3422-4f69-9355-7e8354125352" containerID="37996c0caddd00690195918e2e665067a56aa74dad777c4a536c0e097957b456" 
exitCode=0 Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.179572 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ksj54" event={"ID":"e03c083b-3422-4f69-9355-7e8354125352","Type":"ContainerDied","Data":"37996c0caddd00690195918e2e665067a56aa74dad777c4a536c0e097957b456"} Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.362695 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fa18-account-create-zzrzb"] Dec 04 15:38:11 crc kubenswrapper[4676]: E1204 15:38:11.363258 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c18d40-c03a-4c87-aa2c-ad743179dd6f" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363298 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c18d40-c03a-4c87-aa2c-ad743179dd6f" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: E1204 15:38:11.363323 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d480ec-6f21-494a-b5b6-d58c1842077d" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363331 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d480ec-6f21-494a-b5b6-d58c1842077d" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: E1204 15:38:11.363369 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42384168-5df1-4d2c-aec1-501e67ceb44e" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363380 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="42384168-5df1-4d2c-aec1-501e67ceb44e" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363612 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c18d40-c03a-4c87-aa2c-ad743179dd6f" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363637 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="42384168-5df1-4d2c-aec1-501e67ceb44e" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.363650 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d480ec-6f21-494a-b5b6-d58c1842077d" containerName="mariadb-database-create" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.364622 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.368270 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.375402 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fa18-account-create-zzrzb"] Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.478465 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b95t\" (UniqueName: \"kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t\") pod \"keystone-fa18-account-create-zzrzb\" (UID: \"91e778f2-8276-4efa-b77c-ea0c86d5f5ff\") " pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.553150 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-07ee-account-create-qb5s4"] Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.554455 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.568406 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.573187 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-07ee-account-create-qb5s4"] Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.579892 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b95t\" (UniqueName: \"kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t\") pod \"keystone-fa18-account-create-zzrzb\" (UID: \"91e778f2-8276-4efa-b77c-ea0c86d5f5ff\") " pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.608093 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b95t\" (UniqueName: \"kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t\") pod \"keystone-fa18-account-create-zzrzb\" (UID: \"91e778f2-8276-4efa-b77c-ea0c86d5f5ff\") " pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.682015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxj4t\" (UniqueName: \"kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t\") pod \"placement-07ee-account-create-qb5s4\" (UID: \"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1\") " pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.686336 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.783555 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxj4t\" (UniqueName: \"kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t\") pod \"placement-07ee-account-create-qb5s4\" (UID: \"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1\") " pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:11 crc kubenswrapper[4676]: I1204 15:38:11.892334 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxj4t\" (UniqueName: \"kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t\") pod \"placement-07ee-account-create-qb5s4\" (UID: \"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1\") " pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.171207 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.175117 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fa18-account-create-zzrzb"] Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.190799 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fa18-account-create-zzrzb" event={"ID":"91e778f2-8276-4efa-b77c-ea0c86d5f5ff","Type":"ContainerStarted","Data":"9b0828aba6bdf86b1246412af673dc2b9843b507ac49aea54362cb668d0a7d85"} Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.476822 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686421 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpsd7\" (UniqueName: \"kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686476 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686539 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686593 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686629 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc 
kubenswrapper[4676]: I1204 15:38:12.686693 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.686735 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift\") pod \"e03c083b-3422-4f69-9355-7e8354125352\" (UID: \"e03c083b-3422-4f69-9355-7e8354125352\") " Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.687977 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.688534 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.700586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7" (OuterVolumeSpecName: "kube-api-access-fpsd7") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "kube-api-access-fpsd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.709070 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.724406 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.751527 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts" (OuterVolumeSpecName: "scripts") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.765062 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e03c083b-3422-4f69-9355-7e8354125352" (UID: "e03c083b-3422-4f69-9355-7e8354125352"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795020 4676 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795291 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795304 4676 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e03c083b-3422-4f69-9355-7e8354125352-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795312 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpsd7\" (UniqueName: \"kubernetes.io/projected/e03c083b-3422-4f69-9355-7e8354125352-kube-api-access-fpsd7\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795322 4676 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795332 4676 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e03c083b-3422-4f69-9355-7e8354125352-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.795342 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e03c083b-3422-4f69-9355-7e8354125352-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:12 crc kubenswrapper[4676]: I1204 15:38:12.849286 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-07ee-account-create-qb5s4"] Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.212336 4676 generic.go:334] "Generic (PLEG): container finished" podID="91e778f2-8276-4efa-b77c-ea0c86d5f5ff" containerID="1625cfd497b9024c296cf4c1b522225d33c4e8616be121609fed2408b8d0a134" exitCode=0 Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.212442 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fa18-account-create-zzrzb" event={"ID":"91e778f2-8276-4efa-b77c-ea0c86d5f5ff","Type":"ContainerDied","Data":"1625cfd497b9024c296cf4c1b522225d33c4e8616be121609fed2408b8d0a134"} Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.217337 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-ksj54" event={"ID":"e03c083b-3422-4f69-9355-7e8354125352","Type":"ContainerDied","Data":"ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41"} Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.217371 4676 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="ca2b152fcc0f287aa97e8229e644734a9a49e933dfc2e00294366bff57111b41" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.217410 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ksj54" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.226912 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-07ee-account-create-qb5s4" event={"ID":"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1","Type":"ContainerStarted","Data":"4b5711510172d5ec812817348a57b7874b77a37a96dfbf1d4f1ab15887a7d7cd"} Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.226991 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-07ee-account-create-qb5s4" event={"ID":"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1","Type":"ContainerStarted","Data":"ebd6158dc314dae43c4dd3273c91796d594ce975de8c5a631252b205b646c4f8"} Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.268226 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8r4vm" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.289152 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-07ee-account-create-qb5s4" podStartSLOduration=2.289128084 podStartE2EDuration="2.289128084s" podCreationTimestamp="2025-12-04 15:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:38:13.281927315 +0000 UTC m=+1100.716597172" watchObservedRunningTime="2025-12-04 15:38:13.289128084 +0000 UTC m=+1100.723797941" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.417552 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-hdtnf" podUID="ce63098e-8737-4061-94ce-2b8c76ccb26f" containerName="ovn-controller" probeResult="failure" output=< Dec 04 15:38:13 crc kubenswrapper[4676]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 04 15:38:13 crc kubenswrapper[4676]: > Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.506241 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-7cc2-account-create-kcqmh"] Dec 04 15:38:13 crc kubenswrapper[4676]: E1204 15:38:13.507189 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e03c083b-3422-4f69-9355-7e8354125352" containerName="swift-ring-rebalance" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.507290 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e03c083b-3422-4f69-9355-7e8354125352" containerName="swift-ring-rebalance" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.507636 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e03c083b-3422-4f69-9355-7e8354125352" containerName="swift-ring-rebalance" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.508519 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.514380 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.530454 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-7cc2-account-create-kcqmh"] Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.556167 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-hdtnf-config-nvqg2"] Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.557570 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.563240 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.567732 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hdtnf-config-nvqg2"] Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.702876 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703162 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqvn\" (UniqueName: \"kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn\") pod \"watcher-7cc2-account-create-kcqmh\" (UID: \"bca609f9-fb1d-4be1-a208-d386b661cebf\") " pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703279 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703365 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm6n2\" (UniqueName: \"kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703455 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703591 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: 
\"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.703750 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.804977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805073 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqvn\" (UniqueName: \"kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn\") pod \"watcher-7cc2-account-create-kcqmh\" (UID: \"bca609f9-fb1d-4be1-a208-d386b661cebf\") " pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805173 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm6n2\" (UniqueName: \"kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805206 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805275 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805366 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn\") pod 
\"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805449 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.805487 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.806150 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.808200 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.831887 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm6n2\" (UniqueName: \"kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2\") pod \"ovn-controller-hdtnf-config-nvqg2\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.840992 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqvn\" (UniqueName: \"kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn\") pod \"watcher-7cc2-account-create-kcqmh\" (UID: \"bca609f9-fb1d-4be1-a208-d386b661cebf\") " pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:13 crc kubenswrapper[4676]: I1204 15:38:13.877243 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.127675 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.362766 4676 generic.go:334] "Generic (PLEG): container finished" podID="e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" containerID="4b5711510172d5ec812817348a57b7874b77a37a96dfbf1d4f1ab15887a7d7cd" exitCode=0 Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.363071 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-07ee-account-create-qb5s4" event={"ID":"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1","Type":"ContainerDied","Data":"4b5711510172d5ec812817348a57b7874b77a37a96dfbf1d4f1ab15887a7d7cd"} Dec 04 15:38:14 crc kubenswrapper[4676]: W1204 15:38:14.405921 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d0b56d1_5168_4a28_b75f_e9d7e339fa2b.slice/crio-d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833 WatchSource:0}: Error finding container d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833: Status 404 returned error can't find the container with id d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833 Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.415885 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-hdtnf-config-nvqg2"] Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.507634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-7cc2-account-create-kcqmh"] Dec 04 15:38:14 crc kubenswrapper[4676]: W1204 15:38:14.519751 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbca609f9_fb1d_4be1_a208_d386b661cebf.slice/crio-d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c WatchSource:0}: Error finding container d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c: Status 404 returned error can't find the container with id d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.685546 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.749539 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b95t\" (UniqueName: \"kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t\") pod \"91e778f2-8276-4efa-b77c-ea0c86d5f5ff\" (UID: \"91e778f2-8276-4efa-b77c-ea0c86d5f5ff\") " Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.756104 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t" (OuterVolumeSpecName: "kube-api-access-4b95t") pod "91e778f2-8276-4efa-b77c-ea0c86d5f5ff" (UID: "91e778f2-8276-4efa-b77c-ea0c86d5f5ff"). InnerVolumeSpecName "kube-api-access-4b95t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.852053 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b95t\" (UniqueName: \"kubernetes.io/projected/91e778f2-8276-4efa-b77c-ea0c86d5f5ff-kube-api-access-4b95t\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.913102 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:14 crc kubenswrapper[4676]: I1204 15:38:14.915281 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.372681 4676 generic.go:334] "Generic (PLEG): container finished" podID="8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" containerID="2e36588d3aa3e3b96d231812753dcec1011d788732fec58db777c6c362982fec" exitCode=0 Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.372744 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hdtnf-config-nvqg2" event={"ID":"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b","Type":"ContainerDied","Data":"2e36588d3aa3e3b96d231812753dcec1011d788732fec58db777c6c362982fec"} Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.373037 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hdtnf-config-nvqg2" event={"ID":"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b","Type":"ContainerStarted","Data":"d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833"} Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.374567 4676 generic.go:334] "Generic (PLEG): container finished" podID="bca609f9-fb1d-4be1-a208-d386b661cebf" containerID="49bb83efa5bc52af304067610f962d6f160148b43b60fa43c218dfab9fe9a3b7" exitCode=0 Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.374606 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7cc2-account-create-kcqmh" event={"ID":"bca609f9-fb1d-4be1-a208-d386b661cebf","Type":"ContainerDied","Data":"49bb83efa5bc52af304067610f962d6f160148b43b60fa43c218dfab9fe9a3b7"} Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.374639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7cc2-account-create-kcqmh" event={"ID":"bca609f9-fb1d-4be1-a208-d386b661cebf","Type":"ContainerStarted","Data":"d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c"} Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.376060 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fa18-account-create-zzrzb" event={"ID":"91e778f2-8276-4efa-b77c-ea0c86d5f5ff","Type":"ContainerDied","Data":"9b0828aba6bdf86b1246412af673dc2b9843b507ac49aea54362cb668d0a7d85"} Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.376086 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b0828aba6bdf86b1246412af673dc2b9843b507ac49aea54362cb668d0a7d85" Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.376105 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fa18-account-create-zzrzb" Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.377799 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:15 crc kubenswrapper[4676]: I1204 15:38:15.825426 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.004949 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxj4t\" (UniqueName: \"kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t\") pod \"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1\" (UID: \"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1\") " Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.010748 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t" (OuterVolumeSpecName: "kube-api-access-gxj4t") pod "e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" (UID: "e0ed69b4-f9ab-4a12-8bed-d6e639f518d1"). InnerVolumeSpecName "kube-api-access-gxj4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.027303 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.027402 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.107372 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxj4t\" (UniqueName: \"kubernetes.io/projected/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1-kube-api-access-gxj4t\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.385855 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-07ee-account-create-qb5s4" event={"ID":"e0ed69b4-f9ab-4a12-8bed-d6e639f518d1","Type":"ContainerDied","Data":"ebd6158dc314dae43c4dd3273c91796d594ce975de8c5a631252b205b646c4f8"} Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.386448 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebd6158dc314dae43c4dd3273c91796d594ce975de8c5a631252b205b646c4f8" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.386125 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-07ee-account-create-qb5s4" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.594234 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.599617 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/61ed17c4-ad81-4738-ac71-3b97f42d5211-etc-swift\") pod \"swift-storage-0\" (UID: \"61ed17c4-ad81-4738-ac71-3b97f42d5211\") " pod="openstack/swift-storage-0" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.795796 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.964957 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:16 crc kubenswrapper[4676]: I1204 15:38:16.972577 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.109825 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.109936 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.109979 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run" (OuterVolumeSpecName: "var-run") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110078 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110178 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpqvn\" (UniqueName: \"kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn\") pod \"bca609f9-fb1d-4be1-a208-d386b661cebf\" (UID: \"bca609f9-fb1d-4be1-a208-d386b661cebf\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110250 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm6n2\" (UniqueName: \"kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110304 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110342 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn\") pod \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\" (UID: \"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b\") " Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110628 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110977 4676 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.110995 4676 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.111004 4676 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.111307 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.111378 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts" (OuterVolumeSpecName: "scripts") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.116142 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn" (OuterVolumeSpecName: "kube-api-access-dpqvn") pod "bca609f9-fb1d-4be1-a208-d386b661cebf" (UID: "bca609f9-fb1d-4be1-a208-d386b661cebf"). InnerVolumeSpecName "kube-api-access-dpqvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.117815 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2" (OuterVolumeSpecName: "kube-api-access-tm6n2") pod "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" (UID: "8d0b56d1-5168-4a28-b75f-e9d7e339fa2b"). InnerVolumeSpecName "kube-api-access-tm6n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.212792 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.212840 4676 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.212853 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpqvn\" (UniqueName: \"kubernetes.io/projected/bca609f9-fb1d-4be1-a208-d386b661cebf-kube-api-access-dpqvn\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.212875 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm6n2\" (UniqueName: \"kubernetes.io/projected/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b-kube-api-access-tm6n2\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.397406 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-7cc2-account-create-kcqmh" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.397479 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-7cc2-account-create-kcqmh" event={"ID":"bca609f9-fb1d-4be1-a208-d386b661cebf","Type":"ContainerDied","Data":"d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c"} Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.397858 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d183287a58909057a244e2e5209f1b831397a35ba9572a0e9b0cd8bfe80c6e6c" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.400865 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-hdtnf-config-nvqg2" event={"ID":"8d0b56d1-5168-4a28-b75f-e9d7e339fa2b","Type":"ContainerDied","Data":"d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833"} Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.400898 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8167136f77472e7ff670ba8565a859204202c2815f452e3fcc479377bd52833" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.401093 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-hdtnf-config-nvqg2" Dec 04 15:38:17 crc kubenswrapper[4676]: I1204 15:38:17.449476 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.166885 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-hdtnf-config-nvqg2"] Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.173561 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-hdtnf-config-nvqg2"] Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.397551 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-hdtnf" Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.411033 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"e1a43b67f11e49eb2637c9a376eaf9be434ddb7907a29098e355ec9f0f9ae2be"} Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.678527 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.678846 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="prometheus" containerID="cri-o://e0cb13f40dccb5fead31bf4eb65dffd748aa720a5ee83c48c9486a82f5dc88ce" gracePeriod=600 Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.678995 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="config-reloader" containerID="cri-o://792e5fc1ea1f84be22980db1caeb6f4cee61e88cb54981aea5250f563e98dd20" gracePeriod=600 Dec 04 15:38:18 crc kubenswrapper[4676]: I1204 15:38:18.678977 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="thanos-sidecar" containerID="cri-o://cbfeed3e81d3d27196bf6f56dd102c6ebc1dd0161c168a9e38bc71238441f064" gracePeriod=600 Dec 
04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.460307 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" path="/var/lib/kubelet/pods/8d0b56d1-5168-4a28-b75f-e9d7e339fa2b/volumes" Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.469052 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"0350e4714bde08bc0970e3ceb4813344573b728ff6cb10c65d0bda346398defb"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.469102 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"bd49b6e138bbbbe98123979aa8a843ef525d47a3e7aa42e9d70f1ff599217018"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.469112 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"a11e3e0bff216a2d43a78063db7b1b87f8c5093feda3aca5593924841c733412"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.469121 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"7c1c82a4db586ff495cd624df068ef21f44054489a5e87f49891813c8873e0d5"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493104 4676 generic.go:334] "Generic (PLEG): container finished" podID="c83d9914-203c-4a22-a92f-80851859fd48" containerID="cbfeed3e81d3d27196bf6f56dd102c6ebc1dd0161c168a9e38bc71238441f064" exitCode=0 Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493141 4676 generic.go:334] "Generic (PLEG): container finished" podID="c83d9914-203c-4a22-a92f-80851859fd48" containerID="792e5fc1ea1f84be22980db1caeb6f4cee61e88cb54981aea5250f563e98dd20" exitCode=0 Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493150 4676 generic.go:334] "Generic (PLEG): container finished" podID="c83d9914-203c-4a22-a92f-80851859fd48" containerID="e0cb13f40dccb5fead31bf4eb65dffd748aa720a5ee83c48c9486a82f5dc88ce" exitCode=0 Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerDied","Data":"cbfeed3e81d3d27196bf6f56dd102c6ebc1dd0161c168a9e38bc71238441f064"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493203 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerDied","Data":"792e5fc1ea1f84be22980db1caeb6f4cee61e88cb54981aea5250f563e98dd20"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.493213 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerDied","Data":"e0cb13f40dccb5fead31bf4eb65dffd748aa720a5ee83c48c9486a82f5dc88ce"} Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.859408 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993129 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993213 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993286 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993313 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llzrm\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993412 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993435 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993576 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.993609 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config\") pod \"c83d9914-203c-4a22-a92f-80851859fd48\" (UID: \"c83d9914-203c-4a22-a92f-80851859fd48\") " Dec 04 15:38:19 crc kubenswrapper[4676]: I1204 15:38:19.994512 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.000118 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config" (OuterVolumeSpecName: "config") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.000234 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.000499 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm" (OuterVolumeSpecName: "kube-api-access-llzrm") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "kube-api-access-llzrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.011642 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out" (OuterVolumeSpecName: "config-out") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.018050 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.031009 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config" (OuterVolumeSpecName: "web-config") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.050074 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "c83d9914-203c-4a22-a92f-80851859fd48" (UID: "c83d9914-203c-4a22-a92f-80851859fd48"). InnerVolumeSpecName "pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.095736 4676 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096014 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096132 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llzrm\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-kube-api-access-llzrm\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096235 4676 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83d9914-203c-4a22-a92f-80851859fd48-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096338 4676 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83d9914-203c-4a22-a92f-80851859fd48-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096474 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") on node \"crc\" " Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096568 4676 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83d9914-203c-4a22-a92f-80851859fd48-web-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.096686 4676 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83d9914-203c-4a22-a92f-80851859fd48-config-out\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.135530 4676 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.135703 4676 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515") on node "crc"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.197754 4676 reconciler_common.go:293] "Volume detached for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") on node \"crc\" DevicePath \"\""
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.726133 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"fc3bc9eec7c35674ff83dbabb9bd8119d2fa0d888513a4e3d2ba6ce150ff03f7"}
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.731506 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"c83d9914-203c-4a22-a92f-80851859fd48","Type":"ContainerDied","Data":"1658181f55868b57375875aef87b050de768632265a14fd0aee01662549f7375"}
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.731795 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.732262 4676 scope.go:117] "RemoveContainer" containerID="cbfeed3e81d3d27196bf6f56dd102c6ebc1dd0161c168a9e38bc71238441f064"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.754845 4676 scope.go:117] "RemoveContainer" containerID="792e5fc1ea1f84be22980db1caeb6f4cee61e88cb54981aea5250f563e98dd20"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.773590 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.780751 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.784419 4676 scope.go:117] "RemoveContainer" containerID="e0cb13f40dccb5fead31bf4eb65dffd748aa720a5ee83c48c9486a82f5dc88ce"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808408 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808774 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e778f2-8276-4efa-b77c-ea0c86d5f5ff" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808790 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e778f2-8276-4efa-b77c-ea0c86d5f5ff" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808798 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808804 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808819 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="thanos-sidecar"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808825 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="thanos-sidecar"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808840 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="config-reloader"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808845 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="config-reloader"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808870 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca609f9-fb1d-4be1-a208-d386b661cebf" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808876 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca609f9-fb1d-4be1-a208-d386b661cebf" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808888 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" containerName="ovn-config"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808896 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" containerName="ovn-config"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808928 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="init-config-reloader"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808935 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="init-config-reloader"
Dec 04 15:38:20 crc kubenswrapper[4676]: E1204 15:38:20.808947 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="prometheus"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.808953 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="prometheus"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809146 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809161 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="config-reloader"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809170 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d0b56d1-5168-4a28-b75f-e9d7e339fa2b" containerName="ovn-config"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809180 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca609f9-fb1d-4be1-a208-d386b661cebf" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809192 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="prometheus"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809201 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e778f2-8276-4efa-b77c-ea0c86d5f5ff" containerName="mariadb-account-create"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.809210 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83d9914-203c-4a22-a92f-80851859fd48" containerName="thanos-sidecar"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.811884 4676 scope.go:117] "RemoveContainer" containerID="7dcb20f95ba2c36d461fad9df709170e9632819ba97a81bff20c81a9af750c0b"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.815427 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.821508 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.821559 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-dsbwb"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.822164 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.822335 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.822851 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.822978 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.848409 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 04 15:38:20 crc kubenswrapper[4676]: I1204 15:38:20.862768 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.012573 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.012945 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.012982 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013049 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013129 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013166 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013204 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013414 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.013575 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scn6r\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.115803 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.115864 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scn6r\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.115962 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.115988 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116010 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116065 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116127 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116156 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116185 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116211 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116240 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.116749 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.123805 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.124020 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.132771 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.133035 4676 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.133088 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20a8147025daa03f462937d002ea44fbf472037636c1db1460079ca29c39445e/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.133571 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.134997 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.136076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.139888 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.142972 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.149777 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scn6r\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.206113 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.398778 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83d9914-203c-4a22-a92f-80851859fd48" path="/var/lib/kubelet/pods/c83d9914-203c-4a22-a92f-80851859fd48/volumes"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.490065 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.751247 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"69fb83f4c61694c19cd451325c4cea29b5c9305534d976de0655901d2e63f24e"}
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.751608 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"d5c7aad30181a387759645e1d9f0cd3bad5d2f3653772c3b53ccb3e638d48439"}
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.751624 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"8a67a0e39ce6c4e16903e2e6b8cfdfef317f9d1db90fae111d3cb2a148834062"}
Dec 04 15:38:21 crc kubenswrapper[4676]: I1204 15:38:21.998121 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 04 15:38:22 crc kubenswrapper[4676]: I1204 15:38:22.761034 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerStarted","Data":"3912ccd57b14bef539a7dcb43fff8badc45eb6dbb5e7b86d3a062206bf974983"}
Dec 04 15:38:22 crc kubenswrapper[4676]: I1204 15:38:22.768200 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"af8088034cf41934ded2262c8cc16e0c5aea96a684327e20a15198f3ec8ce85d"}
Dec 04 15:38:22 crc kubenswrapper[4676]: I1204 15:38:22.768490 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"f36c21b41fbfddd2f0fa55467dc8752bf422cce59a8ebdb36a8a49a05ecf38d6"}
Dec 04 15:38:22 crc kubenswrapper[4676]: I1204 15:38:22.768501 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"cce63cac117cb339d288a92c58b50415d06cb4b258a59321280e63e888a945f7"}
Dec 04 15:38:23 crc kubenswrapper[4676]: I1204 15:38:23.807668 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"03686d6ccadf119aa0d0ae9759055e3bc42777cc93768797574f2ab9a8b64f43"}
Dec 04 15:38:23 crc kubenswrapper[4676]: I1204 15:38:23.808234 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"8ed7b7d5010ee649f3ea6aa7ae6fed3abb2d654a20eb5b81663d20062a600ec1"}
Dec 04 15:38:23 crc kubenswrapper[4676]: I1204 15:38:23.808248 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"c96b60a30762c0f7d9d414ad4137512dcf2171f93fc4b5b1fa1d110e47d39048"}
event for pod" pod="openstack/swift-storage-0" event={"ID":"61ed17c4-ad81-4738-ac71-3b97f42d5211","Type":"ContainerStarted","Data":"3821dc3ca03c90932fc930ce755505242b3d5901b19ec6ec322a758dc24859e0"} Dec 04 15:38:23 crc kubenswrapper[4676]: I1204 15:38:23.851698 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.072743323 podStartE2EDuration="40.851675293s" podCreationTimestamp="2025-12-04 15:37:43 +0000 UTC" firstStartedPulling="2025-12-04 15:38:17.461225451 +0000 UTC m=+1104.895895308" lastFinishedPulling="2025-12-04 15:38:22.240157421 +0000 UTC m=+1109.674827278" observedRunningTime="2025-12-04 15:38:23.837429709 +0000 UTC m=+1111.272099566" watchObservedRunningTime="2025-12-04 15:38:23.851675293 +0000 UTC m=+1111.286345150" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.093702 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.104368 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.112474 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.136241 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.266538 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28lpk\" (UniqueName: \"kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.266768 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.266834 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.266994 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.267348 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc 
kubenswrapper[4676]: I1204 15:38:24.267393 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369188 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369315 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369338 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28lpk\" (UniqueName: \"kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369413 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.369456 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.370543 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.370645 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.370878 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.371492 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.371585 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.389673 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28lpk\" (UniqueName: \"kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk\") pod \"dnsmasq-dns-779f74f7bf-7rrdz\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.439494 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.768224 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.822358 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" event={"ID":"99411ac6-aa35-4f96-bf75-783e3dcbdf93","Type":"ContainerStarted","Data":"8c87540b5749f74f14853c9c9901bec81d09a866b8a7ae7fd59f0e6724cbb36a"} Dec 04 15:38:24 crc kubenswrapper[4676]: I1204 15:38:24.826025 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerStarted","Data":"3208384ccdbd564f3e354d6c6164a6970015b7dc9c6d09643c22c9f914c60fe8"} Dec 04 15:38:25 crc kubenswrapper[4676]: I1204 15:38:25.832713 4676 generic.go:334] "Generic (PLEG): container finished" podID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerID="fbac04c0072863afb5ca283bcd18ae655a3c53349c79e66182f093463d9d5596" exitCode=0 Dec 04 15:38:25 crc kubenswrapper[4676]: I1204 15:38:25.832803 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" event={"ID":"99411ac6-aa35-4f96-bf75-783e3dcbdf93","Type":"ContainerDied","Data":"fbac04c0072863afb5ca283bcd18ae655a3c53349c79e66182f093463d9d5596"} Dec 04 15:38:26 crc kubenswrapper[4676]: I1204 15:38:26.843739 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" event={"ID":"99411ac6-aa35-4f96-bf75-783e3dcbdf93","Type":"ContainerStarted","Data":"e13d9c782e905b3a608022c2a1f041ad10314ffb20c2253f1309fead73947429"} Dec 04 15:38:26 crc kubenswrapper[4676]: I1204 15:38:26.844177 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:26 crc kubenswrapper[4676]: I1204 15:38:26.860741 
4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podStartSLOduration=2.8607195069999998 podStartE2EDuration="2.860719507s" podCreationTimestamp="2025-12-04 15:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:38:26.860086298 +0000 UTC m=+1114.294756175" watchObservedRunningTime="2025-12-04 15:38:26.860719507 +0000 UTC m=+1114.295389364" Dec 04 15:38:27 crc kubenswrapper[4676]: I1204 15:38:27.677701 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Dec 04 15:38:27 crc kubenswrapper[4676]: I1204 15:38:27.974759 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Dec 04 15:38:28 crc kubenswrapper[4676]: I1204 15:38:28.276547 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="a074e2a9-e6e9-488d-8338-54231ab8faf9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Dec 04 15:38:31 crc kubenswrapper[4676]: I1204 15:38:31.904254 4676 generic.go:334] "Generic (PLEG): container finished" podID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerID="3208384ccdbd564f3e354d6c6164a6970015b7dc9c6d09643c22c9f914c60fe8" exitCode=0 Dec 04 15:38:31 crc kubenswrapper[4676]: I1204 15:38:31.904350 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerDied","Data":"3208384ccdbd564f3e354d6c6164a6970015b7dc9c6d09643c22c9f914c60fe8"} Dec 04 15:38:32 crc kubenswrapper[4676]: I1204 15:38:32.917994 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerStarted","Data":"991a9a33e94f251f9ea53fb45351db6e413bd376ad4e2f9824b66c19f1bf3920"} Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.441213 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.522675 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.522963 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="dnsmasq-dns" containerID="cri-o://9635365c5da448fbfb1e015d65b56f91ae595b8cbdb34438229c451f1a235dbd" gracePeriod=10 Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.946748 4676 generic.go:334] "Generic (PLEG): container finished" podID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerID="9635365c5da448fbfb1e015d65b56f91ae595b8cbdb34438229c451f1a235dbd" exitCode=0 Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.948021 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" 
event={"ID":"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13","Type":"ContainerDied","Data":"9635365c5da448fbfb1e015d65b56f91ae595b8cbdb34438229c451f1a235dbd"} Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.955256 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerStarted","Data":"a91e1a0554e9ce02fae7e7976179b0a8d76dbbdd0d92ebe87923262e15de4c5a"} Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.955299 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerStarted","Data":"23cf260c28249d11f44440cb43d81694bb3dfa95a535fac9169e3ef103394bce"} Dec 04 15:38:34 crc kubenswrapper[4676]: I1204 15:38:34.983199 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.98317825 podStartE2EDuration="14.98317825s" podCreationTimestamp="2025-12-04 15:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:38:34.978795303 +0000 UTC m=+1122.413465180" watchObservedRunningTime="2025-12-04 15:38:34.98317825 +0000 UTC m=+1122.417848107" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.023675 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.152066 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config\") pod \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.152429 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb\") pod \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.152622 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc\") pod \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.153670 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb\") pod \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.153792 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9rk\" (UniqueName: \"kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk\") pod \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\" (UID: \"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13\") " Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.157586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk" (OuterVolumeSpecName: 
"kube-api-access-5j9rk") pod "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" (UID: "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13"). InnerVolumeSpecName "kube-api-access-5j9rk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.199428 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" (UID: "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.205497 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" (UID: "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.207841 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config" (OuterVolumeSpecName: "config") pod "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" (UID: "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.208282 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" (UID: "f9bc34e2-332b-4bb5-bb8f-dc5e3992be13"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.256547 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.256585 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.256599 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.256609 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.256621 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9rk\" (UniqueName: \"kubernetes.io/projected/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13-kube-api-access-5j9rk\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.964810 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.965692 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c4f5b9f5-bvf7v" event={"ID":"f9bc34e2-332b-4bb5-bb8f-dc5e3992be13","Type":"ContainerDied","Data":"74e996b868cd08e29ec9973f8980ed0733fcb88cf705ef1a58d5367a7e17f63f"} Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.965733 4676 scope.go:117] "RemoveContainer" containerID="9635365c5da448fbfb1e015d65b56f91ae595b8cbdb34438229c451f1a235dbd" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.992782 4676 scope.go:117] "RemoveContainer" containerID="ea3be10bae902b06928a512ae90832e1a78f9cca3811a12c5659ffa5d80f6c65" Dec 04 15:38:35 crc kubenswrapper[4676]: I1204 15:38:35.995826 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:38:36 crc kubenswrapper[4676]: I1204 15:38:36.009411 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c4f5b9f5-bvf7v"] Dec 04 15:38:36 crc kubenswrapper[4676]: I1204 15:38:36.490602 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:36 crc kubenswrapper[4676]: I1204 15:38:36.490653 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:36 crc kubenswrapper[4676]: I1204 15:38:36.497519 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:36 crc kubenswrapper[4676]: I1204 15:38:36.978718 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 15:38:37 crc kubenswrapper[4676]: I1204 15:38:37.393239 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" path="/var/lib/kubelet/pods/f9bc34e2-332b-4bb5-bb8f-dc5e3992be13/volumes" Dec 04 15:38:37 crc kubenswrapper[4676]: I1204 15:38:37.678161 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 15:38:37 crc kubenswrapper[4676]: I1204 15:38:37.975296 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.267693 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-m4p7c"] Dec 04 15:38:38 crc kubenswrapper[4676]: E1204 15:38:38.268287 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="init" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.268309 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="init" Dec 04 15:38:38 crc kubenswrapper[4676]: E1204 15:38:38.268325 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="dnsmasq-dns" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.268332 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="dnsmasq-dns" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.268592 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bc34e2-332b-4bb5-bb8f-dc5e3992be13" containerName="dnsmasq-dns" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 
15:38:38.269473 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.278304 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.284248 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m4p7c"] Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.334605 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtncs\" (UniqueName: \"kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs\") pod \"cinder-db-create-m4p7c\" (UID: \"504e890d-08fd-41c1-b1cd-f0a9480e17df\") " pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.381651 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-vwnjh"] Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.383356 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.401503 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vwnjh"] Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.437007 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtncs\" (UniqueName: \"kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs\") pod \"cinder-db-create-m4p7c\" (UID: \"504e890d-08fd-41c1-b1cd-f0a9480e17df\") " pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.437080 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ljcj\" (UniqueName: \"kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj\") pod \"barbican-db-create-vwnjh\" (UID: \"3fec9aa8-63ba-40bb-9217-590ae458da93\") " pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.461809 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtncs\" (UniqueName: \"kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs\") pod \"cinder-db-create-m4p7c\" (UID: \"504e890d-08fd-41c1-b1cd-f0a9480e17df\") " pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.539572 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ljcj\" (UniqueName: \"kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj\") pod \"barbican-db-create-vwnjh\" (UID: \"3fec9aa8-63ba-40bb-9217-590ae458da93\") " pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.582644 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ljcj\" (UniqueName: \"kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj\") pod \"barbican-db-create-vwnjh\" (UID: \"3fec9aa8-63ba-40bb-9217-590ae458da93\") " pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.597842 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.710738 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-xhjnc"] Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.718860 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.721091 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.725427 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.725722 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.726359 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vsxn6" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.726794 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xhjnc"] Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.732901 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.970048 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8sr\" (UniqueName: \"kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.970523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:38 crc kubenswrapper[4676]: I1204 15:38:38.970585 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.076023 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.076085 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.076126 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8sr\" (UniqueName: 
\"kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.090525 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.090568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.113766 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8sr\" (UniqueName: \"kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr\") pod \"keystone-db-sync-xhjnc\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.245948 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-m4p7c"] Dec 04 15:38:39 crc kubenswrapper[4676]: W1204 15:38:39.251848 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod504e890d_08fd_41c1_b1cd_f0a9480e17df.slice/crio-0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9 WatchSource:0}: Error finding container 0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9: Status 404 returned error can't find the container with id 0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9 Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.375549 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.577844 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-vwnjh"] Dec 04 15:38:39 crc kubenswrapper[4676]: W1204 15:38:39.591478 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fec9aa8_63ba_40bb_9217_590ae458da93.slice/crio-f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0 WatchSource:0}: Error finding container f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0: Status 404 returned error can't find the container with id f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0 Dec 04 15:38:39 crc kubenswrapper[4676]: I1204 15:38:39.713221 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-xhjnc"] Dec 04 15:38:39 crc kubenswrapper[4676]: W1204 15:38:39.767907 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12cc7c9f_d211_490e_b297_0a250646e111.slice/crio-eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f WatchSource:0}: Error finding container eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f: Status 404 returned error can't find the container with id eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.084732 4676 generic.go:334] "Generic (PLEG): container finished" podID="3fec9aa8-63ba-40bb-9217-590ae458da93" containerID="34de63e20dff3df88f43f5c080c02447ca98045b330017a58da2a557c7a04fa8" exitCode=0 Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.084849 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vwnjh" event={"ID":"3fec9aa8-63ba-40bb-9217-590ae458da93","Type":"ContainerDied","Data":"34de63e20dff3df88f43f5c080c02447ca98045b330017a58da2a557c7a04fa8"} Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.085082 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vwnjh" event={"ID":"3fec9aa8-63ba-40bb-9217-590ae458da93","Type":"ContainerStarted","Data":"f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0"} Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.087022 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xhjnc" event={"ID":"12cc7c9f-d211-490e-b297-0a250646e111","Type":"ContainerStarted","Data":"eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f"} Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.093492 4676 generic.go:334] "Generic (PLEG): container finished" podID="504e890d-08fd-41c1-b1cd-f0a9480e17df" containerID="6fac95b64599a521bfb8281e0706e50dbde4706b3fde5b39166b44c0178d6204" exitCode=0 Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.093533 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4p7c" event={"ID":"504e890d-08fd-41c1-b1cd-f0a9480e17df","Type":"ContainerDied","Data":"6fac95b64599a521bfb8281e0706e50dbde4706b3fde5b39166b44c0178d6204"} Dec 04 15:38:40 crc kubenswrapper[4676]: I1204 15:38:40.093557 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4p7c" event={"ID":"504e890d-08fd-41c1-b1cd-f0a9480e17df","Type":"ContainerStarted","Data":"0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9"} Dec 04 15:38:41 crc 
kubenswrapper[4676]: I1204 15:38:41.415500 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lxbp8"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.424880 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lxbp8"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.425046 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.542509 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcllq\" (UniqueName: \"kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq\") pod \"glance-db-create-lxbp8\" (UID: \"20d4d2e0-ea26-476f-b7e6-fd922c301ba0\") " pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.548001 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-cmrp2"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.562969 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.568354 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-cmrp2"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.570308 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-h2jgj" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.576696 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.590544 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-gh7lx"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.592778 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.645105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcllq\" (UniqueName: \"kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq\") pod \"glance-db-create-lxbp8\" (UID: \"20d4d2e0-ea26-476f-b7e6-fd922c301ba0\") " pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.646320 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gh7lx"] Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.683039 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcllq\" (UniqueName: \"kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq\") pod \"glance-db-create-lxbp8\" (UID: \"20d4d2e0-ea26-476f-b7e6-fd922c301ba0\") " pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.748814 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xbm7\" (UniqueName: \"kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.748859 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.748907 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.748990 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.749015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr4w6\" (UniqueName: \"kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6\") pod \"neutron-db-create-gh7lx\" (UID: \"85e23715-9b6f-4307-97f5-36289341911d\") " pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.760185 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.783101 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.792631 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.850382 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xbm7\" (UniqueName: \"kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.851148 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.851322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.851528 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.851660 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr4w6\" (UniqueName: \"kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6\") pod \"neutron-db-create-gh7lx\" (UID: \"85e23715-9b6f-4307-97f5-36289341911d\") " pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.861397 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.869781 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.875068 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.881413 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr4w6\" (UniqueName: \"kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6\") pod \"neutron-db-create-gh7lx\" (UID: \"85e23715-9b6f-4307-97f5-36289341911d\") " pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.882674 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xbm7\" (UniqueName: \"kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7\") pod \"watcher-db-sync-cmrp2\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.958037 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtncs\" (UniqueName: \"kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs\") pod \"504e890d-08fd-41c1-b1cd-f0a9480e17df\" (UID: \"504e890d-08fd-41c1-b1cd-f0a9480e17df\") " Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.958124 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ljcj\" (UniqueName: \"kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj\") pod \"3fec9aa8-63ba-40bb-9217-590ae458da93\" (UID: \"3fec9aa8-63ba-40bb-9217-590ae458da93\") " Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.967144 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs" (OuterVolumeSpecName: "kube-api-access-mtncs") pod "504e890d-08fd-41c1-b1cd-f0a9480e17df" (UID: "504e890d-08fd-41c1-b1cd-f0a9480e17df"). InnerVolumeSpecName "kube-api-access-mtncs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:41 crc kubenswrapper[4676]: I1204 15:38:41.973158 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj" (OuterVolumeSpecName: "kube-api-access-5ljcj") pod "3fec9aa8-63ba-40bb-9217-590ae458da93" (UID: "3fec9aa8-63ba-40bb-9217-590ae458da93"). InnerVolumeSpecName "kube-api-access-5ljcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.058615 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.060656 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtncs\" (UniqueName: \"kubernetes.io/projected/504e890d-08fd-41c1-b1cd-f0a9480e17df-kube-api-access-mtncs\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.060707 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ljcj\" (UniqueName: \"kubernetes.io/projected/3fec9aa8-63ba-40bb-9217-590ae458da93-kube-api-access-5ljcj\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.077553 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.127857 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-m4p7c" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.127835 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-m4p7c" event={"ID":"504e890d-08fd-41c1-b1cd-f0a9480e17df","Type":"ContainerDied","Data":"0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9"} Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.127983 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a213880abdfcd1ca9a1a25c6c32653f144cff5ebd55403ae35656f7620d5ab9" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.131726 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-vwnjh" event={"ID":"3fec9aa8-63ba-40bb-9217-590ae458da93","Type":"ContainerDied","Data":"f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0"} Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.131786 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f372ee8471c63c237ebfb9abfc49af16cd26489d7ada8779052a0329f7cdd0a0" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.131855 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-vwnjh" Dec 04 15:38:42 crc kubenswrapper[4676]: I1204 15:38:42.409527 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lxbp8"] Dec 04 15:38:45 crc kubenswrapper[4676]: W1204 15:38:45.922983 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20d4d2e0_ea26_476f_b7e6_fd922c301ba0.slice/crio-f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7 WatchSource:0}: Error finding container f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7: Status 404 returned error can't find the container with id f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7 Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.027087 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.027396 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.027451 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.028488 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.028548 4676 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b" gracePeriod=600 Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.182146 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lxbp8" event={"ID":"20d4d2e0-ea26-476f-b7e6-fd922c301ba0","Type":"ContainerStarted","Data":"f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7"} Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.185748 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b" exitCode=0 Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.185809 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b"} Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.185887 4676 scope.go:117] "RemoveContainer" containerID="d4e59e979cd83496088e0b3d97a0d08e9a57942e7fa37137c26486dd40de7195" Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.411197 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-cmrp2"] Dec 04 15:38:46 crc kubenswrapper[4676]: I1204 15:38:46.451162 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-gh7lx"] Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.194553 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-cmrp2" event={"ID":"063e66f9-8c76-4a2c-9392-f35b247d1304","Type":"ContainerStarted","Data":"4dab522ba4eb914eb2a74446af7b6a8ef7e03a9a21141190205b0f092c35dcf4"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.199414 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gh7lx" event={"ID":"85e23715-9b6f-4307-97f5-36289341911d","Type":"ContainerStarted","Data":"3fcb2c69f4e90e86f37965b7715aa2fd4009c24cf1d8665b16c457c6c0eff841"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.199454 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gh7lx" event={"ID":"85e23715-9b6f-4307-97f5-36289341911d","Type":"ContainerStarted","Data":"ffe71c1ccebcfba5db37eb19fe04b5e829f9faaa32826121e8c9e79401e63d6f"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.202696 4676 generic.go:334] "Generic (PLEG): container finished" podID="20d4d2e0-ea26-476f-b7e6-fd922c301ba0" containerID="24eaa47e6dfb02dd8e0da3a9bc69fa571bd81f3bbc2f5185e5940f61761077a9" exitCode=0 Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.202743 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lxbp8" event={"ID":"20d4d2e0-ea26-476f-b7e6-fd922c301ba0","Type":"ContainerDied","Data":"24eaa47e6dfb02dd8e0da3a9bc69fa571bd81f3bbc2f5185e5940f61761077a9"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.204633 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.206891 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xhjnc" event={"ID":"12cc7c9f-d211-490e-b297-0a250646e111","Type":"ContainerStarted","Data":"bc4865c331287eaeefe44663a1a8a1cf9db6740287d27940ab743e1f0e51b2b3"} Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.233656 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-gh7lx" podStartSLOduration=6.233618869 podStartE2EDuration="6.233618869s" podCreationTimestamp="2025-12-04 15:38:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:38:47.226301266 +0000 UTC m=+1134.660971133" watchObservedRunningTime="2025-12-04 15:38:47.233618869 +0000 UTC m=+1134.668288726" Dec 04 15:38:47 crc kubenswrapper[4676]: I1204 15:38:47.286329 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-xhjnc" podStartSLOduration=2.718226548 podStartE2EDuration="9.28631107s" podCreationTimestamp="2025-12-04 15:38:38 +0000 UTC" firstStartedPulling="2025-12-04 15:38:39.770633216 +0000 UTC m=+1127.205303073" lastFinishedPulling="2025-12-04 15:38:46.338717738 +0000 UTC m=+1133.773387595" observedRunningTime="2025-12-04 15:38:47.286173936 +0000 UTC m=+1134.720843783" watchObservedRunningTime="2025-12-04 15:38:47.28631107 +0000 UTC m=+1134.720980927" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.241639 4676 generic.go:334] "Generic (PLEG): container finished" podID="85e23715-9b6f-4307-97f5-36289341911d" containerID="3fcb2c69f4e90e86f37965b7715aa2fd4009c24cf1d8665b16c457c6c0eff841" exitCode=0 Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.242057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gh7lx" event={"ID":"85e23715-9b6f-4307-97f5-36289341911d","Type":"ContainerDied","Data":"3fcb2c69f4e90e86f37965b7715aa2fd4009c24cf1d8665b16c457c6c0eff841"} Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.333048 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2e42-account-create-hzn75"] Dec 04 15:38:48 crc kubenswrapper[4676]: E1204 15:38:48.333808 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fec9aa8-63ba-40bb-9217-590ae458da93" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.333836 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fec9aa8-63ba-40bb-9217-590ae458da93" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: E1204 15:38:48.333862 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="504e890d-08fd-41c1-b1cd-f0a9480e17df" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.333871 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="504e890d-08fd-41c1-b1cd-f0a9480e17df" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.334261 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fec9aa8-63ba-40bb-9217-590ae458da93" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.334296 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="504e890d-08fd-41c1-b1cd-f0a9480e17df" containerName="mariadb-database-create" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.335140 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.343234 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2e42-account-create-hzn75"] Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.345622 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.515508 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c87fh\" (UniqueName: \"kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh\") pod \"barbican-2e42-account-create-hzn75\" (UID: \"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a\") " pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.527498 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f615-account-create-nscvm"] Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.529010 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.532236 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.535526 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f615-account-create-nscvm"] Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.617644 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c87fh\" (UniqueName: \"kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh\") pod \"barbican-2e42-account-create-hzn75\" (UID: \"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a\") " pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.648092 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c87fh\" (UniqueName: \"kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh\") pod \"barbican-2e42-account-create-hzn75\" (UID: \"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a\") " pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.712899 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.719649 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjvx6\" (UniqueName: \"kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6\") pod \"cinder-f615-account-create-nscvm\" (UID: \"742fbc26-b6af-40c0-bd23-9c6bacbbe61c\") " pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.730562 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.820966 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjvx6\" (UniqueName: \"kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6\") pod \"cinder-f615-account-create-nscvm\" (UID: \"742fbc26-b6af-40c0-bd23-9c6bacbbe61c\") " pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.845421 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjvx6\" (UniqueName: \"kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6\") pod \"cinder-f615-account-create-nscvm\" (UID: \"742fbc26-b6af-40c0-bd23-9c6bacbbe61c\") " pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.859456 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.921759 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcllq\" (UniqueName: \"kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq\") pod \"20d4d2e0-ea26-476f-b7e6-fd922c301ba0\" (UID: \"20d4d2e0-ea26-476f-b7e6-fd922c301ba0\") " Dec 04 15:38:48 crc kubenswrapper[4676]: I1204 15:38:48.926366 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq" (OuterVolumeSpecName: "kube-api-access-fcllq") pod "20d4d2e0-ea26-476f-b7e6-fd922c301ba0" (UID: "20d4d2e0-ea26-476f-b7e6-fd922c301ba0"). InnerVolumeSpecName "kube-api-access-fcllq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.024174 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcllq\" (UniqueName: \"kubernetes.io/projected/20d4d2e0-ea26-476f-b7e6-fd922c301ba0-kube-api-access-fcllq\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.164341 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2e42-account-create-hzn75"] Dec 04 15:38:49 crc kubenswrapper[4676]: W1204 15:38:49.172309 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e4c2e6a_2e63_4f64_9e3c_c14e6226727a.slice/crio-d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e WatchSource:0}: Error finding container d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e: Status 404 returned error can't find the container with id d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.256267 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2e42-account-create-hzn75" event={"ID":"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a","Type":"ContainerStarted","Data":"d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e"} Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.258780 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lxbp8" event={"ID":"20d4d2e0-ea26-476f-b7e6-fd922c301ba0","Type":"ContainerDied","Data":"f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7"} Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.259089 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02bfd09791951221107e4e3d96978561aedec897b0ce0049cccbc34755174b7" Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.259160 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lxbp8" Dec 04 15:38:49 crc kubenswrapper[4676]: I1204 15:38:49.333767 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f615-account-create-nscvm"] Dec 04 15:38:50 crc kubenswrapper[4676]: I1204 15:38:50.274876 4676 generic.go:334] "Generic (PLEG): container finished" podID="6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" containerID="61cb2c916525a98cf6a703c62e3947df817ddaff8a4fe1391b8b2b16b28219cd" exitCode=0 Dec 04 15:38:50 crc kubenswrapper[4676]: I1204 15:38:50.274995 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2e42-account-create-hzn75" event={"ID":"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a","Type":"ContainerDied","Data":"61cb2c916525a98cf6a703c62e3947df817ddaff8a4fe1391b8b2b16b28219cd"} Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.812067 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.820865 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.931114 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c87fh\" (UniqueName: \"kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh\") pod \"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a\" (UID: \"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a\") " Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.931399 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr4w6\" (UniqueName: \"kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6\") pod \"85e23715-9b6f-4307-97f5-36289341911d\" (UID: \"85e23715-9b6f-4307-97f5-36289341911d\") " Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.938245 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh" (OuterVolumeSpecName: "kube-api-access-c87fh") pod "6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" (UID: "6e4c2e6a-2e63-4f64-9e3c-c14e6226727a"). InnerVolumeSpecName "kube-api-access-c87fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:54 crc kubenswrapper[4676]: I1204 15:38:54.939252 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6" (OuterVolumeSpecName: "kube-api-access-gr4w6") pod "85e23715-9b6f-4307-97f5-36289341911d" (UID: "85e23715-9b6f-4307-97f5-36289341911d"). InnerVolumeSpecName "kube-api-access-gr4w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.033939 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr4w6\" (UniqueName: \"kubernetes.io/projected/85e23715-9b6f-4307-97f5-36289341911d-kube-api-access-gr4w6\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.033991 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c87fh\" (UniqueName: \"kubernetes.io/projected/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a-kube-api-access-c87fh\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.325411 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2e42-account-create-hzn75" event={"ID":"6e4c2e6a-2e63-4f64-9e3c-c14e6226727a","Type":"ContainerDied","Data":"d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e"} Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.325454 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d172968ed0a6e1d2aac892da1d5da115a217d360e8fd4119581a4555a1329b8e" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.325490 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2e42-account-create-hzn75" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.330382 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-gh7lx" event={"ID":"85e23715-9b6f-4307-97f5-36289341911d","Type":"ContainerDied","Data":"ffe71c1ccebcfba5db37eb19fe04b5e829f9faaa32826121e8c9e79401e63d6f"} Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.330414 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffe71c1ccebcfba5db37eb19fe04b5e829f9faaa32826121e8c9e79401e63d6f" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.330497 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-gh7lx" Dec 04 15:38:55 crc kubenswrapper[4676]: I1204 15:38:55.333380 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 04 15:38:56 crc kubenswrapper[4676]: I1204 15:38:56.345936 4676 generic.go:334] "Generic (PLEG): container finished" podID="742fbc26-b6af-40c0-bd23-9c6bacbbe61c" containerID="7def4f7329a205bf4dab65733e4954db001e46a20fe3862d1c3b58576f64f8dd" exitCode=0 Dec 04 15:38:56 crc kubenswrapper[4676]: I1204 15:38:56.346286 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f615-account-create-nscvm" event={"ID":"742fbc26-b6af-40c0-bd23-9c6bacbbe61c","Type":"ContainerDied","Data":"7def4f7329a205bf4dab65733e4954db001e46a20fe3862d1c3b58576f64f8dd"} Dec 04 15:38:56 crc kubenswrapper[4676]: I1204 15:38:56.346320 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f615-account-create-nscvm" event={"ID":"742fbc26-b6af-40c0-bd23-9c6bacbbe61c","Type":"ContainerStarted","Data":"10e2c47a9c756690638c65eab0a99c13b5b7684fd485829a30d671f9c6d784de"} Dec 04 15:38:56 crc kubenswrapper[4676]: I1204 15:38:56.349278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-cmrp2" event={"ID":"063e66f9-8c76-4a2c-9392-f35b247d1304","Type":"ContainerStarted","Data":"9ac6a9b70e7cb8225f2fff4e9dcf7c078f8b53f35739ef899a5b0e7928318e06"} Dec 04 15:38:56 crc kubenswrapper[4676]: I1204 15:38:56.385399 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-cmrp2" podStartSLOduration=6.071131008 podStartE2EDuration="15.385351948s" podCreationTimestamp="2025-12-04 15:38:41 +0000 UTC" firstStartedPulling="2025-12-04 15:38:46.43448979 +0000 UTC m=+1133.869159647" lastFinishedPulling="2025-12-04 15:38:55.74871073 +0000 UTC m=+1143.183380587" observedRunningTime="2025-12-04 15:38:56.379423365 +0000 UTC m=+1143.814093232" watchObservedRunningTime="2025-12-04 15:38:56.385351948 +0000 UTC m=+1143.820021795" Dec 04 15:38:57 crc kubenswrapper[4676]: I1204 15:38:57.759712 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:57 crc kubenswrapper[4676]: I1204 15:38:57.918797 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjvx6\" (UniqueName: \"kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6\") pod \"742fbc26-b6af-40c0-bd23-9c6bacbbe61c\" (UID: \"742fbc26-b6af-40c0-bd23-9c6bacbbe61c\") " Dec 04 15:38:57 crc kubenswrapper[4676]: I1204 15:38:57.933896 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6" (OuterVolumeSpecName: "kube-api-access-kjvx6") pod "742fbc26-b6af-40c0-bd23-9c6bacbbe61c" (UID: "742fbc26-b6af-40c0-bd23-9c6bacbbe61c"). InnerVolumeSpecName "kube-api-access-kjvx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.021936 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjvx6\" (UniqueName: \"kubernetes.io/projected/742fbc26-b6af-40c0-bd23-9c6bacbbe61c-kube-api-access-kjvx6\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.369159 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f615-account-create-nscvm" event={"ID":"742fbc26-b6af-40c0-bd23-9c6bacbbe61c","Type":"ContainerDied","Data":"10e2c47a9c756690638c65eab0a99c13b5b7684fd485829a30d671f9c6d784de"} Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.369202 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e2c47a9c756690638c65eab0a99c13b5b7684fd485829a30d671f9c6d784de" Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.369224 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f615-account-create-nscvm" Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.370476 4676 generic.go:334] "Generic (PLEG): container finished" podID="12cc7c9f-d211-490e-b297-0a250646e111" containerID="bc4865c331287eaeefe44663a1a8a1cf9db6740287d27940ab743e1f0e51b2b3" exitCode=0 Dec 04 15:38:58 crc kubenswrapper[4676]: I1204 15:38:58.370503 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xhjnc" event={"ID":"12cc7c9f-d211-490e-b297-0a250646e111","Type":"ContainerDied","Data":"bc4865c331287eaeefe44663a1a8a1cf9db6740287d27940ab743e1f0e51b2b3"} Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.379895 4676 generic.go:334] "Generic (PLEG): container finished" podID="063e66f9-8c76-4a2c-9392-f35b247d1304" containerID="9ac6a9b70e7cb8225f2fff4e9dcf7c078f8b53f35739ef899a5b0e7928318e06" exitCode=0 Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.379949 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-cmrp2" event={"ID":"063e66f9-8c76-4a2c-9392-f35b247d1304","Type":"ContainerDied","Data":"9ac6a9b70e7cb8225f2fff4e9dcf7c078f8b53f35739ef899a5b0e7928318e06"} Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.703134 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.847170 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle\") pod \"12cc7c9f-d211-490e-b297-0a250646e111\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.847277 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data\") pod \"12cc7c9f-d211-490e-b297-0a250646e111\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.847385 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8sr\" (UniqueName: \"kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr\") pod \"12cc7c9f-d211-490e-b297-0a250646e111\" (UID: \"12cc7c9f-d211-490e-b297-0a250646e111\") " Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.852861 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr" (OuterVolumeSpecName: "kube-api-access-9v8sr") pod "12cc7c9f-d211-490e-b297-0a250646e111" (UID: "12cc7c9f-d211-490e-b297-0a250646e111"). InnerVolumeSpecName "kube-api-access-9v8sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.874823 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12cc7c9f-d211-490e-b297-0a250646e111" (UID: "12cc7c9f-d211-490e-b297-0a250646e111"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.892264 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data" (OuterVolumeSpecName: "config-data") pod "12cc7c9f-d211-490e-b297-0a250646e111" (UID: "12cc7c9f-d211-490e-b297-0a250646e111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.949497 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8sr\" (UniqueName: \"kubernetes.io/projected/12cc7c9f-d211-490e-b297-0a250646e111-kube-api-access-9v8sr\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.949546 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:38:59 crc kubenswrapper[4676]: I1204 15:38:59.949560 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12cc7c9f-d211-490e-b297-0a250646e111-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.393461 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-xhjnc" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.395092 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-xhjnc" event={"ID":"12cc7c9f-d211-490e-b297-0a250646e111","Type":"ContainerDied","Data":"eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f"} Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.395175 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb057a60ec84d7523c968d44528945f274eb02daaa74abb14d329445ed36520f" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.620570 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.688858 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689488 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e23715-9b6f-4307-97f5-36289341911d" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689521 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e23715-9b6f-4307-97f5-36289341911d" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689545 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742fbc26-b6af-40c0-bd23-9c6bacbbe61c" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689552 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="742fbc26-b6af-40c0-bd23-9c6bacbbe61c" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689568 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12cc7c9f-d211-490e-b297-0a250646e111" containerName="keystone-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689574 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="12cc7c9f-d211-490e-b297-0a250646e111" containerName="keystone-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689598 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689604 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689619 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20d4d2e0-ea26-476f-b7e6-fd922c301ba0" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689627 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="20d4d2e0-ea26-476f-b7e6-fd922c301ba0" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: E1204 15:39:00.689636 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="063e66f9-8c76-4a2c-9392-f35b247d1304" containerName="watcher-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689643 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="063e66f9-8c76-4a2c-9392-f35b247d1304" containerName="watcher-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689924 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="12cc7c9f-d211-490e-b297-0a250646e111" 
containerName="keystone-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689950 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="742fbc26-b6af-40c0-bd23-9c6bacbbe61c" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689960 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="063e66f9-8c76-4a2c-9392-f35b247d1304" containerName="watcher-db-sync" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689974 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" containerName="mariadb-account-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689991 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="20d4d2e0-ea26-476f-b7e6-fd922c301ba0" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.689999 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e23715-9b6f-4307-97f5-36289341911d" containerName="mariadb-database-create" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.691389 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.707544 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.721849 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-hnngv"] Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.726078 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.729850 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.731002 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.731096 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vsxn6" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.731216 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.753464 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hnngv"] Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.766830 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle\") pod \"063e66f9-8c76-4a2c-9392-f35b247d1304\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.766925 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xbm7\" (UniqueName: \"kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7\") pod \"063e66f9-8c76-4a2c-9392-f35b247d1304\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.767082 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data\") pod \"063e66f9-8c76-4a2c-9392-f35b247d1304\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.767142 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data\") pod \"063e66f9-8c76-4a2c-9392-f35b247d1304\" (UID: \"063e66f9-8c76-4a2c-9392-f35b247d1304\") " Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.781462 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7" (OuterVolumeSpecName: "kube-api-access-9xbm7") pod "063e66f9-8c76-4a2c-9392-f35b247d1304" (UID: "063e66f9-8c76-4a2c-9392-f35b247d1304"). InnerVolumeSpecName "kube-api-access-9xbm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.783672 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "063e66f9-8c76-4a2c-9392-f35b247d1304" (UID: "063e66f9-8c76-4a2c-9392-f35b247d1304"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.805188 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "063e66f9-8c76-4a2c-9392-f35b247d1304" (UID: "063e66f9-8c76-4a2c-9392-f35b247d1304"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.880194 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992112 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs8vv\" (UniqueName: \"kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992271 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992342 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992439 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmsxf\" (UniqueName: \"kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992547 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992659 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992752 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb\") pod 
\"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992831 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992868 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.992941 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.993017 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.993035 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:00 crc kubenswrapper[4676]: I1204 15:39:00.993052 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xbm7\" (UniqueName: \"kubernetes.io/projected/063e66f9-8c76-4a2c-9392-f35b247d1304-kube-api-access-9xbm7\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.029107 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data" (OuterVolumeSpecName: "config-data") pod "063e66f9-8c76-4a2c-9392-f35b247d1304" (UID: "063e66f9-8c76-4a2c-9392-f35b247d1304"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.075870 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.077350 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.083303 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.083360 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.083306 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.083663 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mn58s" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094590 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094645 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094684 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094706 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094733 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094784 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094810 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs8vv\" (UniqueName: \"kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094838 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094859 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094885 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmsxf\" (UniqueName: \"kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.094964 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.095413 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.103730 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/063e66f9-8c76-4a2c-9392-f35b247d1304-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.104679 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.110606 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.113474 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.115712 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.115728 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.116503 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.117729 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.120335 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.135291 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.158147 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.162987 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmsxf\" (UniqueName: \"kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf\") pod \"keystone-bootstrap-hnngv\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.179382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs8vv\" (UniqueName: \"kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv\") pod \"dnsmasq-dns-854f4d7cbc-mkbcm\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.190732 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.192850 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.198569 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.199582 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.204650 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.204693 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.204729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdggp\" (UniqueName: \"kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.204803 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.204888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.332482 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335000 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335108 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335157 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rxsr\" (UniqueName: \"kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335220 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335262 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335335 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdggp\" (UniqueName: \"kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335391 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335424 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335462 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335545 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335603 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.335643 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.343329 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.350766 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.366236 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.537580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.539629 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546001 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546063 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rxsr\" (UniqueName: \"kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546172 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546206 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546269 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.546342 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.555650 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdggp\" (UniqueName: \"kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp\") pod \"horizon-764d75d947-w4sq5\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.558565 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.565498 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.570657 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.581351 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.597721 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.598195 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.616540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rxsr\" (UniqueName: \"kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr\") pod \"ceilometer-0\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") " pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.618579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.626137 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-cmrp2" event={"ID":"063e66f9-8c76-4a2c-9392-f35b247d1304","Type":"ContainerDied","Data":"4dab522ba4eb914eb2a74446af7b6a8ef7e03a9a21141190205b0f092c35dcf4"} Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.626182 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dab522ba4eb914eb2a74446af7b6a8ef7e03a9a21141190205b0f092c35dcf4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.626293 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-cmrp2" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.708230 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.714028 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.756059 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.758617 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.758658 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.758675 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.758750 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhkp6\" (UniqueName: \"kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.758780 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.761365 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:01 crc kubenswrapper[4676]: E1204 15:39:01.852160 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod063e66f9_8c76_4a2c_9392_f35b247d1304.slice\": RecentStats: unable to find data in memory cache]" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.874455 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.875922 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.875974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.875999 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.876091 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhkp6\" (UniqueName: \"kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.876129 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.878304 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.881810 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.884684 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.929669 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jlg26"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.930981 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.937030 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.937279 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-55k2j" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.940659 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.942002 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.976257 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.980812 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhkp6\" (UniqueName: \"kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6\") pod \"horizon-8f696d8d9-98tv4\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.981512 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.981557 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68b9f\" (UniqueName: \"kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.981583 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.981632 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:01 crc kubenswrapper[4676]: I1204 15:39:01.981827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:01.998885 4676 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jlg26"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.046118 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-6b4sd"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.047829 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.062559 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6b4sd"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.069754 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r22qq" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.070328 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.078708 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.080403 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.084796 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.084852 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68b9f\" (UniqueName: \"kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.084894 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.084983 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.085140 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.085515 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.091170 4676 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2cec-account-create-fwvgn"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.092414 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.101158 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.101752 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.123490 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2cec-account-create-fwvgn"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.130287 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.130623 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.133459 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.153805 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209105 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209167 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvk7z\" (UniqueName: \"kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209272 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209326 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209439 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ltmm\" (UniqueName: \"kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm\") pod \"glance-2cec-account-create-fwvgn\" (UID: \"8e4af1e4-191b-483a-9886-f07cc9829079\") " pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209508 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7lbp\" (UniqueName: \"kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209570 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209643 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209708 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.209739 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.221978 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0109-account-create-zfgjz"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.323994 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.325673 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.325779 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.327772 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.334450 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.334833 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.334957 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.335151 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data\") pod 
\"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.335200 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvk7z\" (UniqueName: \"kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.335276 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.335329 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.335495 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ltmm\" (UniqueName: \"kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm\") pod \"glance-2cec-account-create-fwvgn\" (UID: \"8e4af1e4-191b-483a-9886-f07cc9829079\") " pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.350060 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.351249 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.395540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.409786 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.410243 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.410365 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0109-account-create-zfgjz"] Dec 04 
15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.411689 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.437968 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7lbp\" (UniqueName: \"kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.438044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ndn\" (UniqueName: \"kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn\") pod \"neutron-0109-account-create-zfgjz\" (UID: \"0a7820f5-8870-4da3-8576-328966fdc552\") " pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.472156 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvk7z\" (UniqueName: \"kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z\") pod \"dnsmasq-dns-5b9df8fb6c-mjt7v\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") " pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.480233 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ltmm\" (UniqueName: \"kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm\") pod \"glance-2cec-account-create-fwvgn\" (UID: \"8e4af1e4-191b-483a-9886-f07cc9829079\") " pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.508062 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.509386 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.511107 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68b9f\" (UniqueName: \"kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f\") pod \"placement-db-sync-jlg26\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.515199 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.519648 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540182 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfptw\" (UniqueName: \"kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540305 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540430 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ndn\" (UniqueName: \"kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn\") pod \"neutron-0109-account-create-zfgjz\" (UID: \"0a7820f5-8870-4da3-8576-328966fdc552\") " pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540469 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.540519 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.555402 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.556190 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-h2jgj" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.562354 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7lbp\" (UniqueName: \"kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp\") pod \"barbican-db-sync-6b4sd\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.699498 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.703347 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.705579 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.705615 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.705706 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.705764 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.705843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfptw\" (UniqueName: \"kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.707373 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.736605 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.737241 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.757638 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ndn\" (UniqueName: \"kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn\") pod 
\"neutron-0109-account-create-zfgjz\" (UID: \"0a7820f5-8870-4da3-8576-328966fdc552\") " pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.758481 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.772334 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:02 crc kubenswrapper[4676]: I1204 15:39:02.800303 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfptw\" (UniqueName: \"kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw\") pod \"watcher-decision-engine-0\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") " pod="openstack/watcher-decision-engine-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:02.818163 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:02.900290 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.194770 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.199206 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.208225 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.218322 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.220138 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.227235 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.241916 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.262524 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.362814 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.362896 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.362965 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363056 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4sb7\" (UniqueName: \"kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363110 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363150 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjqst\" (UniqueName: \"kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363196 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363238 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") 
" pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.363280 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496214 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496291 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496329 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496404 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4sb7\" (UniqueName: \"kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496444 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496513 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjqst\" (UniqueName: \"kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496570 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.496639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs\") pod 
\"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.497226 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.497790 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.505059 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.505608 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.506689 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.526766 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.535549 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.540826 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4sb7\" (UniqueName: \"kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7\") pod \"watcher-api-0\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " pod="openstack/watcher-api-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.547512 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjqst\" (UniqueName: \"kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst\") pod \"watcher-applier-0\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") " pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.609969 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 15:39:03 crc kubenswrapper[4676]: I1204 15:39:03.624735 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.009421 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-nnr52"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.014872 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.026287 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.026732 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.026852 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jh4hq" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.066658 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nnr52"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167600 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167684 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167712 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167748 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167802 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq29l\" (UniqueName: \"kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.167840 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " 
pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.237481 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330383 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330486 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330509 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330550 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq29l\" (UniqueName: \"kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330576 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.330717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.359732 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.360299 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 
15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.363674 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.384087 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.394672 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq29l\" (UniqueName: \"kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l\") pod \"cinder-db-sync-nnr52\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") " pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.403946 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-hnngv"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.423355 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.444201 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr52" Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.907700 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" event={"ID":"214010a3-d12a-4ff2-94b9-c0613c81d389","Type":"ContainerStarted","Data":"c8c6e1fef27f8f019e35d7dab85beb284352963b16ed982cb1a6e261f21f1157"} Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.908977 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnngv" event={"ID":"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f","Type":"ContainerStarted","Data":"d9d6e9c850df1a8f333c5216a7978234f6691ecc68a4d160f6b1ced28f6c300a"} Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.909950 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-764d75d947-w4sq5" event={"ID":"342b7993-6fce-4369-8a5c-ce88e185a83f","Type":"ContainerStarted","Data":"489f385f312169b1a8c478875f46b3ecce88fcb8312ea3e90c85a1fffff70feb"} Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.960191 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:04 crc kubenswrapper[4676]: I1204 15:39:04.974975 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.015794 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0109-account-create-zfgjz"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.039004 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.040956 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.054469 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.078413 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.105350 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.105459 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.105581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx26z\" (UniqueName: \"kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.105639 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.105667 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.265896 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-6b4sd"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.268736 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2cec-account-create-fwvgn"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.327279 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.327541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx26z\" (UniqueName: \"kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.327606 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.327646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.327769 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.328616 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.342583 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.342869 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.360724 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.464675 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx26z\" (UniqueName: \"kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z\") pod \"horizon-6b9b49fb9-6mlqm\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.497646 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.497705 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.512171 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jlg26"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.710122 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.730309 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.784411 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.836493 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.871543 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.946591 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" event={"ID":"3ed7fb0d-bb13-44f2-9e12-fe5829c660af","Type":"ContainerStarted","Data":"31c6ceb3936a713da96f40d993d079b8ecec8545890b51a5cb2f5a4156771764"} Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.951288 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f696d8d9-98tv4" event={"ID":"c53ed0cb-2204-41f6-8474-c4afb7b7048e","Type":"ContainerStarted","Data":"00857035fd7485983bae741b2ebbab80eb37606bab6587fceb3ce1a43dbd6014"} Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.970092 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"904535432fc47f7b1b1d3c4610189de6b90c1e1c083943416a48bcc79a1e46d1"} Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.976793 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerStarted","Data":"32e5f16028fc90f5de5f4035e8874d01204e1ccd47f0c4d902d36022562e20e7"} Dec 04 15:39:05 crc kubenswrapper[4676]: I1204 15:39:05.984972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerStarted","Data":"69c687050a25a19c894e9bb0cdc038d6e977abdb0683f5db8e7afddd122a9f60"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.019120 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6b4sd" event={"ID":"4feecc1c-e63e-4063-947d-4c2c619525a7","Type":"ContainerStarted","Data":"51c5b9d3a7fad3f4558a2774d0f486b3ef89b76f2e698a1fe200334f1d8fca30"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.036109 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jlg26" event={"ID":"89c93c13-31d1-4762-9457-90e32c63873e","Type":"ContainerStarted","Data":"9131dfc0f2beff70028a7dd02300fcecae8b70b1388bd9bb748563e5edb563b7"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.044052 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2cec-account-create-fwvgn" event={"ID":"8e4af1e4-191b-483a-9886-f07cc9829079","Type":"ContainerStarted","Data":"2ca689f06de1a60255abbd33cb67fdd9d964fc5f6023c3923701c0e60ff06360"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.266736 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"aefbcd15-a508-4c33-9e9a-1e98106e3949","Type":"ContainerStarted","Data":"e1ffc908d8c9256437066011cf16e9ecc94c767fbda7785e52a77c16d3fe45e5"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.290147 4676 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-0109-account-create-zfgjz" event={"ID":"0a7820f5-8870-4da3-8576-328966fdc552","Type":"ContainerStarted","Data":"faf15d514e1f0535b5d4819e64586b5835b7dc391d95a448130634e7d8fe1df1"} Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.823670 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:06 crc kubenswrapper[4676]: I1204 15:39:06.839925 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-nnr52"] Dec 04 15:39:06 crc kubenswrapper[4676]: W1204 15:39:06.841489 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ad42d36_c6d4_4145_bfb1_c91bf3ca64c0.slice/crio-da8633e0b900558b54e4bb9d959f7e5466fd8d39159450a3f431c56f2db4c51e WatchSource:0}: Error finding container da8633e0b900558b54e4bb9d959f7e5466fd8d39159450a3f431c56f2db4c51e: Status 404 returned error can't find the container with id da8633e0b900558b54e4bb9d959f7e5466fd8d39159450a3f431c56f2db4c51e Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.315483 4676 generic.go:334] "Generic (PLEG): container finished" podID="214010a3-d12a-4ff2-94b9-c0613c81d389" containerID="6c1fb78f3d3fa352efb2e7728be5d8944bce7fc39dd7320622f57bd18a8ce6ff" exitCode=0 Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.315881 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" event={"ID":"214010a3-d12a-4ff2-94b9-c0613c81d389","Type":"ContainerDied","Data":"6c1fb78f3d3fa352efb2e7728be5d8944bce7fc39dd7320622f57bd18a8ce6ff"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.330853 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0109-account-create-zfgjz" event={"ID":"0a7820f5-8870-4da3-8576-328966fdc552","Type":"ContainerStarted","Data":"75ac6da838127e7ec899ecc0d54e089850e02abf2537c18b4c89930e36b0566a"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.334281 4676 generic.go:334] "Generic (PLEG): container finished" podID="8e4af1e4-191b-483a-9886-f07cc9829079" containerID="968e54280a93060ccd7017e5c8d8dc4184f1217ca82c03089b0584c7098f8efa" exitCode=0 Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.334367 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2cec-account-create-fwvgn" event={"ID":"8e4af1e4-191b-483a-9886-f07cc9829079","Type":"ContainerDied","Data":"968e54280a93060ccd7017e5c8d8dc4184f1217ca82c03089b0584c7098f8efa"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.352844 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr52" event={"ID":"c8534e22-ee3e-4b6c-92a8-1790b69f335d","Type":"ContainerStarted","Data":"9b573d1245392b34f4ba6726d28ccd8b4ddbbd5e64c607e235ec0ff8c4d7e6a7"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.366871 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnngv" event={"ID":"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f","Type":"ContainerStarted","Data":"11e68f22087cbef6fd18fafc8e8fc08b35bde36e595de05e4c3639967d15e93d"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.378027 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerStarted","Data":"963086bffca8b87b5aa065a50d26960d1b48ed6fc2610c4e34d4c896793ae1e4"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.381115 4676 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-6b9b49fb9-6mlqm" event={"ID":"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0","Type":"ContainerStarted","Data":"da8633e0b900558b54e4bb9d959f7e5466fd8d39159450a3f431c56f2db4c51e"} Dec 04 15:39:07 crc kubenswrapper[4676]: I1204 15:39:07.414288 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-hnngv" podStartSLOduration=7.414251424 podStartE2EDuration="7.414251424s" podCreationTimestamp="2025-12-04 15:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:07.406474258 +0000 UTC m=+1154.841144125" watchObservedRunningTime="2025-12-04 15:39:07.414251424 +0000 UTC m=+1154.848921281" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.312862 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.446807 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.446857 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.446889 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.446947 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs8vv\" (UniqueName: \"kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.446986 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.447451 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config\") pod \"214010a3-d12a-4ff2-94b9-c0613c81d389\" (UID: \"214010a3-d12a-4ff2-94b9-c0613c81d389\") " Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.451225 4676 generic.go:334] "Generic (PLEG): container finished" podID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerID="75822f86c899f749943237accfe41642dd9812954e385868cb951c2a685ef826" exitCode=0 Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.451349 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" 
event={"ID":"3ed7fb0d-bb13-44f2-9e12-fe5829c660af","Type":"ContainerDied","Data":"75822f86c899f749943237accfe41642dd9812954e385868cb951c2a685ef826"} Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.499397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerStarted","Data":"f1bb5409ccc6dc4915f09e70a63bcef34cb3bff7b54d2170b8435a7a753a6413"} Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.499620 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api-log" containerID="cri-o://963086bffca8b87b5aa065a50d26960d1b48ed6fc2610c4e34d4c896793ae1e4" gracePeriod=30 Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.499818 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" containerID="cri-o://f1bb5409ccc6dc4915f09e70a63bcef34cb3bff7b54d2170b8435a7a753a6413" gracePeriod=30 Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.499995 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.509310 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv" (OuterVolumeSpecName: "kube-api-access-rs8vv") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). InnerVolumeSpecName "kube-api-access-rs8vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.536603 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" event={"ID":"214010a3-d12a-4ff2-94b9-c0613c81d389","Type":"ContainerDied","Data":"c8c6e1fef27f8f019e35d7dab85beb284352963b16ed982cb1a6e261f21f1157"} Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.536693 4676 scope.go:117] "RemoveContainer" containerID="6c1fb78f3d3fa352efb2e7728be5d8944bce7fc39dd7320622f57bd18a8ce6ff" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.572849 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.581809 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-854f4d7cbc-mkbcm" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.603821 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=6.603799336 podStartE2EDuration="6.603799336s" podCreationTimestamp="2025-12-04 15:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:08.559499089 +0000 UTC m=+1155.994168946" watchObservedRunningTime="2025-12-04 15:39:08.603799336 +0000 UTC m=+1156.038469193" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.605711 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.605767 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs8vv\" (UniqueName: \"kubernetes.io/projected/214010a3-d12a-4ff2-94b9-c0613c81d389-kube-api-access-rs8vv\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.610225 4676 generic.go:334] "Generic (PLEG): container finished" podID="0a7820f5-8870-4da3-8576-328966fdc552" containerID="75ac6da838127e7ec899ecc0d54e089850e02abf2537c18b4c89930e36b0566a" exitCode=0 Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.610555 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0109-account-create-zfgjz" event={"ID":"0a7820f5-8870-4da3-8576-328966fdc552","Type":"ContainerDied","Data":"75ac6da838127e7ec899ecc0d54e089850e02abf2537c18b4c89930e36b0566a"} Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.634970 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.648878 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.680969 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.689443 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config" (OuterVolumeSpecName: "config") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.692700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "214010a3-d12a-4ff2-94b9-c0613c81d389" (UID: "214010a3-d12a-4ff2-94b9-c0613c81d389"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.724954 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.724991 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.736791 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.736866 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/214010a3-d12a-4ff2-94b9-c0613c81d389-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:08 crc kubenswrapper[4676]: I1204 15:39:08.818612 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": EOF" Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.256966 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": EOF" Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.278736 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.287324 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-854f4d7cbc-mkbcm"] Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.540899 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214010a3-d12a-4ff2-94b9-c0613c81d389" path="/var/lib/kubelet/pods/214010a3-d12a-4ff2-94b9-c0613c81d389/volumes" Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.653206 4676 generic.go:334] "Generic (PLEG): container finished" podID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerID="963086bffca8b87b5aa065a50d26960d1b48ed6fc2610c4e34d4c896793ae1e4" exitCode=143 Dec 04 15:39:09 crc kubenswrapper[4676]: I1204 15:39:09.653274 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerDied","Data":"963086bffca8b87b5aa065a50d26960d1b48ed6fc2610c4e34d4c896793ae1e4"} Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.856169 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.907851 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78c887c44-wcq82"] Dec 04 15:39:10 crc kubenswrapper[4676]: E1204 15:39:10.908417 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214010a3-d12a-4ff2-94b9-c0613c81d389" containerName="init" Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.908450 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="214010a3-d12a-4ff2-94b9-c0613c81d389" 
containerName="init" Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.908707 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="214010a3-d12a-4ff2-94b9-c0613c81d389" containerName="init" Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.910134 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.921423 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.927045 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78c887c44-wcq82"] Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.958941 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.992864 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74857cd458-nnlq7"] Dec 04 15:39:10 crc kubenswrapper[4676]: I1204 15:39:10.995114 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.026962 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74857cd458-nnlq7"] Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028434 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028496 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028536 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028587 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028605 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028649 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.028723 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xjv2\" (UniqueName: \"kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131201 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131325 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twf2h\" (UniqueName: \"kubernetes.io/projected/062c032e-aef9-4036-8d2b-dc89641ed977-kube-api-access-twf2h\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131365 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131437 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131539 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131648 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-tls-certs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-secret-key\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-scripts\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131778 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.131833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-combined-ca-bundle\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.132008 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xjv2\" (UniqueName: \"kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.132065 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/062c032e-aef9-4036-8d2b-dc89641ed977-logs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.132148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-config-data\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.133922 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.135320 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.135627 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs\") pod \"horizon-78c887c44-wcq82\" (UID: 
\"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.137323 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.138800 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.157566 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xjv2\" (UniqueName: \"kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.170787 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs\") pod \"horizon-78c887c44-wcq82\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241146 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-tls-certs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-secret-key\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-scripts\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241249 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-combined-ca-bundle\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241348 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/062c032e-aef9-4036-8d2b-dc89641ed977-logs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241533 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-config-data\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.241627 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twf2h\" (UniqueName: \"kubernetes.io/projected/062c032e-aef9-4036-8d2b-dc89641ed977-kube-api-access-twf2h\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.242130 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-scripts\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.244726 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-tls-certs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.245562 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-combined-ca-bundle\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.246115 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/062c032e-aef9-4036-8d2b-dc89641ed977-config-data\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.252583 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/062c032e-aef9-4036-8d2b-dc89641ed977-horizon-secret-key\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.252810 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/062c032e-aef9-4036-8d2b-dc89641ed977-logs\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.268849 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.276590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twf2h\" (UniqueName: \"kubernetes.io/projected/062c032e-aef9-4036-8d2b-dc89641ed977-kube-api-access-twf2h\") pod \"horizon-74857cd458-nnlq7\" (UID: \"062c032e-aef9-4036-8d2b-dc89641ed977\") " pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:11 crc kubenswrapper[4676]: I1204 15:39:11.327414 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:12 crc kubenswrapper[4676]: I1204 15:39:12.726571 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0109-account-create-zfgjz" event={"ID":"0a7820f5-8870-4da3-8576-328966fdc552","Type":"ContainerDied","Data":"faf15d514e1f0535b5d4819e64586b5835b7dc391d95a448130634e7d8fe1df1"} Dec 04 15:39:12 crc kubenswrapper[4676]: I1204 15:39:12.726882 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="faf15d514e1f0535b5d4819e64586b5835b7dc391d95a448130634e7d8fe1df1" Dec 04 15:39:12 crc kubenswrapper[4676]: I1204 15:39:12.913814 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:12 crc kubenswrapper[4676]: I1204 15:39:12.931209 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.476054 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ltmm\" (UniqueName: \"kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm\") pod \"8e4af1e4-191b-483a-9886-f07cc9829079\" (UID: \"8e4af1e4-191b-483a-9886-f07cc9829079\") " Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.476402 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ndn\" (UniqueName: \"kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn\") pod \"0a7820f5-8870-4da3-8576-328966fdc552\" (UID: \"0a7820f5-8870-4da3-8576-328966fdc552\") " Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.522283 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm" (OuterVolumeSpecName: "kube-api-access-7ltmm") pod "8e4af1e4-191b-483a-9886-f07cc9829079" (UID: "8e4af1e4-191b-483a-9886-f07cc9829079"). InnerVolumeSpecName "kube-api-access-7ltmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.530199 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn" (OuterVolumeSpecName: "kube-api-access-67ndn") pod "0a7820f5-8870-4da3-8576-328966fdc552" (UID: "0a7820f5-8870-4da3-8576-328966fdc552"). InnerVolumeSpecName "kube-api-access-67ndn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.584452 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ltmm\" (UniqueName: \"kubernetes.io/projected/8e4af1e4-191b-483a-9886-f07cc9829079-kube-api-access-7ltmm\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.584477 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ndn\" (UniqueName: \"kubernetes.io/projected/0a7820f5-8870-4da3-8576-328966fdc552-kube-api-access-67ndn\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.739369 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0109-account-create-zfgjz" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.739397 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2cec-account-create-fwvgn" Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.739443 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2cec-account-create-fwvgn" event={"ID":"8e4af1e4-191b-483a-9886-f07cc9829079","Type":"ContainerDied","Data":"2ca689f06de1a60255abbd33cb67fdd9d964fc5f6023c3923701c0e60ff06360"} Dec 04 15:39:13 crc kubenswrapper[4676]: I1204 15:39:13.739471 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca689f06de1a60255abbd33cb67fdd9d964fc5f6023c3923701c0e60ff06360" Dec 04 15:39:14 crc kubenswrapper[4676]: I1204 15:39:14.761672 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" event={"ID":"3ed7fb0d-bb13-44f2-9e12-fe5829c660af","Type":"ContainerStarted","Data":"dc83a9536d95eb7a61acfb9801cd70f396a2071a9fa4018f6c2f5123c8cbb9d0"} Dec 04 15:39:14 crc kubenswrapper[4676]: I1204 15:39:14.763042 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:14 crc kubenswrapper[4676]: I1204 15:39:14.790157 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" podStartSLOduration=13.790135496 podStartE2EDuration="13.790135496s" podCreationTimestamp="2025-12-04 15:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:14.785773699 +0000 UTC m=+1162.220443556" watchObservedRunningTime="2025-12-04 15:39:14.790135496 +0000 UTC m=+1162.224805353" Dec 04 15:39:15 crc kubenswrapper[4676]: I1204 15:39:15.167048 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": read tcp 10.217.0.2:52508->10.217.0.153:9322: read: connection reset by peer" Dec 04 15:39:15 crc kubenswrapper[4676]: I1204 15:39:15.775035 4676 generic.go:334] "Generic (PLEG): container finished" podID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerID="f1bb5409ccc6dc4915f09e70a63bcef34cb3bff7b54d2170b8435a7a753a6413" exitCode=0 Dec 04 15:39:15 crc kubenswrapper[4676]: I1204 15:39:15.775391 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerDied","Data":"f1bb5409ccc6dc4915f09e70a63bcef34cb3bff7b54d2170b8435a7a753a6413"} Dec 04 15:39:15 crc 
kubenswrapper[4676]: I1204 15:39:15.777147 4676 generic.go:334] "Generic (PLEG): container finished" podID="d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" containerID="11e68f22087cbef6fd18fafc8e8fc08b35bde36e595de05e4c3639967d15e93d" exitCode=0 Dec 04 15:39:15 crc kubenswrapper[4676]: I1204 15:39:15.778185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnngv" event={"ID":"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f","Type":"ContainerDied","Data":"11e68f22087cbef6fd18fafc8e8fc08b35bde36e595de05e4c3639967d15e93d"} Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.945969 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pksjc"] Dec 04 15:39:16 crc kubenswrapper[4676]: E1204 15:39:16.946400 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e4af1e4-191b-483a-9886-f07cc9829079" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.946415 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4af1e4-191b-483a-9886-f07cc9829079" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: E1204 15:39:16.946454 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7820f5-8870-4da3-8576-328966fdc552" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.946460 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7820f5-8870-4da3-8576-328966fdc552" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.946669 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7820f5-8870-4da3-8576-328966fdc552" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.946684 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e4af1e4-191b-483a-9886-f07cc9829079" containerName="mariadb-account-create" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.949061 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.951231 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.952380 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wtvdx" Dec 04 15:39:16 crc kubenswrapper[4676]: I1204 15:39:16.971390 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pksjc"] Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.061728 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.061824 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c86c\" (UniqueName: \"kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.062235 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.062437 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.165318 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.165750 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.165801 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c86c\" (UniqueName: \"kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.166082 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data\") pod 
\"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.173504 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.174399 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.185385 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.188846 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c86c\" (UniqueName: \"kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c\") pod \"glance-db-sync-pksjc\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.277419 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pksjc" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.302395 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mxcxz"] Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.303897 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.306210 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.306451 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.306729 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sjgpz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.312632 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mxcxz"] Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.473952 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.474020 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.474066 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmxsp\" (UniqueName: \"kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.576425 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.576547 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.576599 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmxsp\" (UniqueName: \"kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.581619 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.582823 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.599106 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmxsp\" (UniqueName: \"kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp\") pod \"neutron-db-sync-mxcxz\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:17 crc kubenswrapper[4676]: I1204 15:39:17.636182 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:39:18 crc kubenswrapper[4676]: I1204 15:39:18.626488 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": dial tcp 10.217.0.153:9322: connect: connection refused" Dec 04 15:39:21 crc kubenswrapper[4676]: E1204 15:39:21.886585 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 04 15:39:21 crc kubenswrapper[4676]: E1204 15:39:21.886894 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-placement-api:watcher_latest" Dec 04 15:39:21 crc kubenswrapper[4676]: E1204 15:39:21.887097 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:38.129.56.200:5001/podified-master-centos10/openstack-placement-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68b9f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-jlg26_openstack(89c93c13-31d1-4762-9457-90e32c63873e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:21 crc kubenswrapper[4676]: E1204 15:39:21.888271 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-jlg26" podUID="89c93c13-31d1-4762-9457-90e32c63873e" Dec 04 15:39:22 crc kubenswrapper[4676]: I1204 15:39:22.702975 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" Dec 04 15:39:22 crc kubenswrapper[4676]: I1204 15:39:22.764959 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:39:22 crc kubenswrapper[4676]: I1204 15:39:22.765290 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" containerID="cri-o://e13d9c782e905b3a608022c2a1f041ad10314ffb20c2253f1309fead73947429" gracePeriod=10 Dec 04 15:39:22 crc kubenswrapper[4676]: I1204 15:39:22.906195 4676 generic.go:334] "Generic (PLEG): container finished" podID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerID="e13d9c782e905b3a608022c2a1f041ad10314ffb20c2253f1309fead73947429" exitCode=0 Dec 04 15:39:22 crc kubenswrapper[4676]: I1204 15:39:22.906306 4676 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" event={"ID":"99411ac6-aa35-4f96-bf75-783e3dcbdf93","Type":"ContainerDied","Data":"e13d9c782e905b3a608022c2a1f041ad10314ffb20c2253f1309fead73947429"} Dec 04 15:39:22 crc kubenswrapper[4676]: E1204 15:39:22.909291 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-placement-api:watcher_latest\\\"\"" pod="openstack/placement-db-sync-jlg26" podUID="89c93c13-31d1-4762-9457-90e32c63873e" Dec 04 15:39:23 crc kubenswrapper[4676]: I1204 15:39:23.626380 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": dial tcp 10.217.0.153:9322: connect: connection refused" Dec 04 15:39:24 crc kubenswrapper[4676]: I1204 15:39:24.440883 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.239100 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.239446 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.239581 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n576h59fh59fh64dh57dhf5h577h97h645h598hd7h95hcch599h5b8hb4h66bh54h74h94hbh67bh54bhb8h5fch55fhfdhcch65ch64bh5b4h695q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdggp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-764d75d947-w4sq5_openstack(342b7993-6fce-4369-8a5c-ce88e185a83f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.242095 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-764d75d947-w4sq5" podUID="342b7993-6fce-4369-8a5c-ce88e185a83f" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.247442 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.247497 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.247619 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncch5b8h567hfbh55ch5fh7h564h547h689h5dfh68bh594h674h87h9fh58bh57h64h576h5c5h74h544h9ch646h558h65bhb7h5bdh659hbh657q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sx26z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6b9b49fb9-6mlqm_openstack(1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.250156 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-6b9b49fb9-6mlqm" podUID="1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.279454 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.279525 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.279656 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n595h57bh656h79h586h544h648hd5h658h549h5d9h5ch577h698h664h564h598hcbh68ch594hc5h676h68bh56ch6ch567h58fhbch544hf5h6ch67fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:yes,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhkp6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8f696d8d9-98tv4_openstack(c53ed0cb-2204-41f6-8474-c4afb7b7048e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.281692 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-horizon:watcher_latest\\\"\"]" pod="openstack/horizon-8f696d8d9-98tv4" podUID="c53ed0cb-2204-41f6-8474-c4afb7b7048e" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.329770 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.394997 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.395347 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.395420 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmsxf\" (UniqueName: \"kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.395547 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.395652 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.395727 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data\") pod \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\" (UID: \"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f\") " Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.411495 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.411638 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts" (OuterVolumeSpecName: "scripts") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.411727 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf" (OuterVolumeSpecName: "kube-api-access-tmsxf") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "kube-api-access-tmsxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.424200 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.429986 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.459330 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data" (OuterVolumeSpecName: "config-data") pod "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" (UID: "d24b191f-1bab-42bf-a9e6-a0aa6b4b881f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.497962 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.498011 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.498024 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.498035 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.498051 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmsxf\" (UniqueName: \"kubernetes.io/projected/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-kube-api-access-tmsxf\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.498064 4676 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.824025 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.824320 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-barbican-api:watcher_latest" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 
15:39:27.824583 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.129.56.200:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7lbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-6b4sd_openstack(4feecc1c-e63e-4063-947d-4c2c619525a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.825821 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-6b4sd" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.964380 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-hnngv" event={"ID":"d24b191f-1bab-42bf-a9e6-a0aa6b4b881f","Type":"ContainerDied","Data":"d9d6e9c850df1a8f333c5216a7978234f6691ecc68a4d160f6b1ced28f6c300a"} Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.964419 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-hnngv" Dec 04 15:39:27 crc kubenswrapper[4676]: I1204 15:39:27.964434 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9d6e9c850df1a8f333c5216a7978234f6691ecc68a4d160f6b1ced28f6c300a" Dec 04 15:39:27 crc kubenswrapper[4676]: E1204 15:39:27.969666 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-6b4sd" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" Dec 04 15:39:28 crc kubenswrapper[4676]: E1204 15:39:28.385882 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 04 15:39:28 crc kubenswrapper[4676]: E1204 15:39:28.386164 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Dec 04 15:39:28 crc kubenswrapper[4676]: E1204 15:39:28.386291 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.129.56.200:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbh64bh646h579h587h667h68h5fh588h7bh65dhc8h656h5bch5b9h86h66hc7h67dh98h8h566hcch695h8h5cfh548h67fh8dh5f4h68fh5c8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9rxsr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6cfbf976-db77-44d0-9a80-83648d806eea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.449051 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-hnngv"] Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.456654 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-hnngv"] Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.532276 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-llvh8"] Dec 04 15:39:28 crc kubenswrapper[4676]: E1204 15:39:28.532812 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" containerName="keystone-bootstrap" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.532852 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" containerName="keystone-bootstrap" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.533154 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" containerName="keystone-bootstrap" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.534014 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.538972 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.540183 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.540312 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vsxn6" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.540192 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.549746 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-llvh8"] Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.637462 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznx9\" (UniqueName: \"kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.637862 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.638012 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.638282 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.638320 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.638465 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740616 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data\") pod \"keystone-bootstrap-llvh8\" (UID: 
\"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740677 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740747 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740815 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dznx9\" (UniqueName: \"kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740858 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.740888 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.746156 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.746256 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.747431 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.749278 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.749786 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.760665 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dznx9\" (UniqueName: \"kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9\") pod \"keystone-bootstrap-llvh8\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:28 crc kubenswrapper[4676]: I1204 15:39:28.869242 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:29 crc kubenswrapper[4676]: I1204 15:39:29.408333 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24b191f-1bab-42bf-a9e6-a0aa6b4b881f" path="/var/lib/kubelet/pods/d24b191f-1bab-42bf-a9e6-a0aa6b4b881f/volumes" Dec 04 15:39:29 crc kubenswrapper[4676]: I1204 15:39:29.440333 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Dec 04 15:39:33 crc kubenswrapper[4676]: I1204 15:39:33.626659 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:39:34 crc kubenswrapper[4676]: I1204 15:39:34.440400 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Dec 04 15:39:34 crc kubenswrapper[4676]: I1204 15:39:34.440836 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:39:38 crc kubenswrapper[4676]: I1204 15:39:38.627736 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.314436 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.324113 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.332645 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.355062 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.375457 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sx26z\" (UniqueName: \"kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z\") pod \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.375516 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca\") pod \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.375563 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key\") pod \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.375658 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data\") pod \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.375701 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4sb7\" (UniqueName: \"kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7\") pod \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376585 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts" (OuterVolumeSpecName: "scripts") pod "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" (UID: "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376713 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data" (OuterVolumeSpecName: "config-data") pod "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" (UID: "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376744 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts\") pod \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376818 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs\") pod \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376872 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle\") pod \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.376932 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data\") pod \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\" (UID: \"b03adc1c-f52f-4170-8ba0-d8d24da99bb9\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.377180 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs\") pod \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\" (UID: \"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.377871 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs" (OuterVolumeSpecName: "logs") pod "b03adc1c-f52f-4170-8ba0-d8d24da99bb9" (UID: "b03adc1c-f52f-4170-8ba0-d8d24da99bb9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.380178 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.380215 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.380227 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.380700 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs" (OuterVolumeSpecName: "logs") pod "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" (UID: "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.392149 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z" (OuterVolumeSpecName: "kube-api-access-sx26z") pod "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" (UID: "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0"). InnerVolumeSpecName "kube-api-access-sx26z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.394073 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" (UID: "1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.415127 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7" (OuterVolumeSpecName: "kube-api-access-t4sb7") pod "b03adc1c-f52f-4170-8ba0-d8d24da99bb9" (UID: "b03adc1c-f52f-4170-8ba0-d8d24da99bb9"). InnerVolumeSpecName "kube-api-access-t4sb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.430072 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b03adc1c-f52f-4170-8ba0-d8d24da99bb9" (UID: "b03adc1c-f52f-4170-8ba0-d8d24da99bb9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.481850 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdggp\" (UniqueName: \"kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp\") pod \"342b7993-6fce-4369-8a5c-ce88e185a83f\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482239 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key\") pod \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482404 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhkp6\" (UniqueName: \"kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6\") pod \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482452 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts\") pod \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482518 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key\") pod \"342b7993-6fce-4369-8a5c-ce88e185a83f\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482543 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data\") pod \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482567 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs\") pod \"342b7993-6fce-4369-8a5c-ce88e185a83f\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482641 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts\") pod \"342b7993-6fce-4369-8a5c-ce88e185a83f\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482705 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data\") pod \"342b7993-6fce-4369-8a5c-ce88e185a83f\" (UID: \"342b7993-6fce-4369-8a5c-ce88e185a83f\") " Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.482735 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs\") pod \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\" (UID: \"c53ed0cb-2204-41f6-8474-c4afb7b7048e\") " Dec 
04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483310 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483338 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4sb7\" (UniqueName: \"kubernetes.io/projected/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-kube-api-access-t4sb7\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483351 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483362 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483373 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sx26z\" (UniqueName: \"kubernetes.io/projected/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0-kube-api-access-sx26z\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483607 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data" (OuterVolumeSpecName: "config-data") pod "c53ed0cb-2204-41f6-8474-c4afb7b7048e" (UID: "c53ed0cb-2204-41f6-8474-c4afb7b7048e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483720 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs" (OuterVolumeSpecName: "logs") pod "c53ed0cb-2204-41f6-8474-c4afb7b7048e" (UID: "c53ed0cb-2204-41f6-8474-c4afb7b7048e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.484252 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts" (OuterVolumeSpecName: "scripts") pod "342b7993-6fce-4369-8a5c-ce88e185a83f" (UID: "342b7993-6fce-4369-8a5c-ce88e185a83f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.485048 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data" (OuterVolumeSpecName: "config-data") pod "342b7993-6fce-4369-8a5c-ce88e185a83f" (UID: "342b7993-6fce-4369-8a5c-ce88e185a83f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.485539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs" (OuterVolumeSpecName: "logs") pod "342b7993-6fce-4369-8a5c-ce88e185a83f" (UID: "342b7993-6fce-4369-8a5c-ce88e185a83f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.483167 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts" (OuterVolumeSpecName: "scripts") pod "c53ed0cb-2204-41f6-8474-c4afb7b7048e" (UID: "c53ed0cb-2204-41f6-8474-c4afb7b7048e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.488718 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b03adc1c-f52f-4170-8ba0-d8d24da99bb9" (UID: "b03adc1c-f52f-4170-8ba0-d8d24da99bb9"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.488726 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "342b7993-6fce-4369-8a5c-ce88e185a83f" (UID: "342b7993-6fce-4369-8a5c-ce88e185a83f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.488761 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp" (OuterVolumeSpecName: "kube-api-access-bdggp") pod "342b7993-6fce-4369-8a5c-ce88e185a83f" (UID: "342b7993-6fce-4369-8a5c-ce88e185a83f"). InnerVolumeSpecName "kube-api-access-bdggp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.491073 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c53ed0cb-2204-41f6-8474-c4afb7b7048e" (UID: "c53ed0cb-2204-41f6-8474-c4afb7b7048e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.491744 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6" (OuterVolumeSpecName: "kube-api-access-vhkp6") pod "c53ed0cb-2204-41f6-8474-c4afb7b7048e" (UID: "c53ed0cb-2204-41f6-8474-c4afb7b7048e"). InnerVolumeSpecName "kube-api-access-vhkp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.519021 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data" (OuterVolumeSpecName: "config-data") pod "b03adc1c-f52f-4170-8ba0-d8d24da99bb9" (UID: "b03adc1c-f52f-4170-8ba0-d8d24da99bb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586012 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c53ed0cb-2204-41f6-8474-c4afb7b7048e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586084 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhkp6\" (UniqueName: \"kubernetes.io/projected/c53ed0cb-2204-41f6-8474-c4afb7b7048e-kube-api-access-vhkp6\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586107 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586124 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/342b7993-6fce-4369-8a5c-ce88e185a83f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586167 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c53ed0cb-2204-41f6-8474-c4afb7b7048e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586180 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/342b7993-6fce-4369-8a5c-ce88e185a83f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586191 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586203 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586214 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/342b7993-6fce-4369-8a5c-ce88e185a83f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586255 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c53ed0cb-2204-41f6-8474-c4afb7b7048e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586266 4676 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b03adc1c-f52f-4170-8ba0-d8d24da99bb9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:39 crc kubenswrapper[4676]: I1204 15:39:39.586278 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdggp\" (UniqueName: \"kubernetes.io/projected/342b7993-6fce-4369-8a5c-ce88e185a83f-kube-api-access-bdggp\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.083734 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b03adc1c-f52f-4170-8ba0-d8d24da99bb9","Type":"ContainerDied","Data":"69c687050a25a19c894e9bb0cdc038d6e977abdb0683f5db8e7afddd122a9f60"} Dec 04 15:39:40 crc 
kubenswrapper[4676]: I1204 15:39:40.083756 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.083795 4676 scope.go:117] "RemoveContainer" containerID="f1bb5409ccc6dc4915f09e70a63bcef34cb3bff7b54d2170b8435a7a753a6413" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.086168 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-764d75d947-w4sq5" event={"ID":"342b7993-6fce-4369-8a5c-ce88e185a83f","Type":"ContainerDied","Data":"489f385f312169b1a8c478875f46b3ecce88fcb8312ea3e90c85a1fffff70feb"} Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.086186 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-764d75d947-w4sq5" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.088809 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b9b49fb9-6mlqm" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.088826 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b9b49fb9-6mlqm" event={"ID":"1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0","Type":"ContainerDied","Data":"da8633e0b900558b54e4bb9d959f7e5466fd8d39159450a3f431c56f2db4c51e"} Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.091300 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8f696d8d9-98tv4" event={"ID":"c53ed0cb-2204-41f6-8474-c4afb7b7048e","Type":"ContainerDied","Data":"00857035fd7485983bae741b2ebbab80eb37606bab6587fceb3ce1a43dbd6014"} Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.091399 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8f696d8d9-98tv4" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.121505 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.142667 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.164344 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:40 crc kubenswrapper[4676]: E1204 15:39:40.164711 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api-log" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.164722 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api-log" Dec 04 15:39:40 crc kubenswrapper[4676]: E1204 15:39:40.164733 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.164739 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.164942 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api-log" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.164962 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.165944 4676 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.173644 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.176444 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.206977 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-764d75d947-w4sq5"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.213024 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.244747 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.252361 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b9b49fb9-6mlqm"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.285439 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.297872 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8f696d8d9-98tv4"] Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.309438 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.309494 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdwlf\" (UniqueName: \"kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.309607 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.309840 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.310088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.417636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: 
\"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.417741 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.417786 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.417953 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.417977 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdwlf\" (UniqueName: \"kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.418775 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.423274 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.423478 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.424890 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.439190 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdwlf\" (UniqueName: \"kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf\") pod \"watcher-api-0\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " pod="openstack/watcher-api-0" Dec 04 15:39:40 crc kubenswrapper[4676]: I1204 15:39:40.489596 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.399064 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0" path="/var/lib/kubelet/pods/1ad42d36-c6d4-4145-bfb1-c91bf3ca64c0/volumes" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.399562 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="342b7993-6fce-4369-8a5c-ce88e185a83f" path="/var/lib/kubelet/pods/342b7993-6fce-4369-8a5c-ce88e185a83f/volumes" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.400178 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" path="/var/lib/kubelet/pods/b03adc1c-f52f-4170-8ba0-d8d24da99bb9/volumes" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.403807 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53ed0cb-2204-41f6-8474-c4afb7b7048e" path="/var/lib/kubelet/pods/c53ed0cb-2204-41f6-8474-c4afb7b7048e/volumes" Dec 04 15:39:41 crc kubenswrapper[4676]: E1204 15:39:41.757098 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 04 15:39:41 crc kubenswrapper[4676]: E1204 15:39:41.757154 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Dec 04 15:39:41 crc kubenswrapper[4676]: E1204 15:39:41.757279 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.129.56.200:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wq29l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-nnr52_openstack(c8534e22-ee3e-4b6c-92a8-1790b69f335d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 15:39:41 crc kubenswrapper[4676]: E1204 15:39:41.762347 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-nnr52" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.845011 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946116 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946213 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946288 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946402 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946441 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.946477 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28lpk\" (UniqueName: \"kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk\") pod \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\" (UID: \"99411ac6-aa35-4f96-bf75-783e3dcbdf93\") " Dec 04 15:39:41 crc kubenswrapper[4676]: I1204 15:39:41.952846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk" (OuterVolumeSpecName: "kube-api-access-28lpk") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "kube-api-access-28lpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.000400 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.003099 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.009173 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config" (OuterVolumeSpecName: "config") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.026617 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.030231 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "99411ac6-aa35-4f96-bf75-783e3dcbdf93" (UID: "99411ac6-aa35-4f96-bf75-783e3dcbdf93"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048446 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048506 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048524 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048536 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28lpk\" (UniqueName: \"kubernetes.io/projected/99411ac6-aa35-4f96-bf75-783e3dcbdf93-kube-api-access-28lpk\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048549 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.048561 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99411ac6-aa35-4f96-bf75-783e3dcbdf93-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.116510 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.117350 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" event={"ID":"99411ac6-aa35-4f96-bf75-783e3dcbdf93","Type":"ContainerDied","Data":"8c87540b5749f74f14853c9c9901bec81d09a866b8a7ae7fd59f0e6724cbb36a"} Dec 04 15:39:42 crc kubenswrapper[4676]: E1204 15:39:42.120383 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-nnr52" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.140952 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74857cd458-nnlq7"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.167052 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.176822 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-779f74f7bf-7rrdz"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.233132 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78c887c44-wcq82"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.236965 4676 scope.go:117] "RemoveContainer" containerID="963086bffca8b87b5aa065a50d26960d1b48ed6fc2610c4e34d4c896793ae1e4" Dec 04 15:39:42 crc kubenswrapper[4676]: W1204 15:39:42.241601 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod062c032e_aef9_4036_8d2b_dc89641ed977.slice/crio-b2c82a02b72e05464f7aefe87aee2ac43e49317a7c526842d4236d36e5664694 WatchSource:0}: Error finding container b2c82a02b72e05464f7aefe87aee2ac43e49317a7c526842d4236d36e5664694: Status 404 returned error can't find the container with id b2c82a02b72e05464f7aefe87aee2ac43e49317a7c526842d4236d36e5664694 Dec 04 15:39:42 crc kubenswrapper[4676]: W1204 15:39:42.301626 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf68f12a3_a61b_492b_94e9_4351419cfa7b.slice/crio-3313912cb7088c955042933763f209091d3fffc4985c84e3a203790365256d22 WatchSource:0}: Error finding container 3313912cb7088c955042933763f209091d3fffc4985c84e3a203790365256d22: Status 404 returned error can't find the container with id 3313912cb7088c955042933763f209091d3fffc4985c84e3a203790365256d22 Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.329845 4676 scope.go:117] "RemoveContainer" containerID="e13d9c782e905b3a608022c2a1f041ad10314ffb20c2253f1309fead73947429" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.341702 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mxcxz"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.435375 4676 scope.go:117] "RemoveContainer" containerID="fbac04c0072863afb5ca283bcd18ae655a3c53349c79e66182f093463d9d5596" Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.831548 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:42 crc kubenswrapper[4676]: I1204 15:39:42.843656 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-llvh8"] Dec 04 15:39:42 crc kubenswrapper[4676]: 
I1204 15:39:42.969164 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pksjc"] Dec 04 15:39:42 crc kubenswrapper[4676]: W1204 15:39:42.975008 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ac7518d_e354_42a9_85e4_766e455bf838.slice/crio-09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b WatchSource:0}: Error finding container 09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b: Status 404 returned error can't find the container with id 09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.143682 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pksjc" event={"ID":"3ac7518d-e354-42a9-85e4-766e455bf838","Type":"ContainerStarted","Data":"09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.147180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mxcxz" event={"ID":"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997","Type":"ContainerStarted","Data":"3824cedf3821404ecaa93361a03f6ca90e326fcb663133d0b9765ae49aef9e60"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.147268 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mxcxz" event={"ID":"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997","Type":"ContainerStarted","Data":"be48d26610ec7c0bb1baf5c0ea2e5a3e66bdd9c2f0800eb22dc7143aa4fa1bbb"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.155177 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerStarted","Data":"3313912cb7088c955042933763f209091d3fffc4985c84e3a203790365256d22"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.162237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerStarted","Data":"920e42e2f6d1710ff7746d4e8c9fd2a421138a0fd5e72681a7cefcf8ea903df6"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.181039 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llvh8" event={"ID":"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9","Type":"ContainerStarted","Data":"cf0fd718398c0fd6ca6178cf80dbebc069329efe9e2732dc7d0757afc8a93607"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.182019 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mxcxz" podStartSLOduration=26.181998579 podStartE2EDuration="26.181998579s" podCreationTimestamp="2025-12-04 15:39:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:43.165742474 +0000 UTC m=+1190.600412331" watchObservedRunningTime="2025-12-04 15:39:43.181998579 +0000 UTC m=+1190.616668436" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.183372 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"06ec8a0508c5113e48edf514194942f4a4010038009e62e14430ef46574abe17"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.185458 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74857cd458-nnlq7" 
event={"ID":"062c032e-aef9-4036-8d2b-dc89641ed977","Type":"ContainerStarted","Data":"b2c82a02b72e05464f7aefe87aee2ac43e49317a7c526842d4236d36e5664694"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.193435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"aefbcd15-a508-4c33-9e9a-1e98106e3949","Type":"ContainerStarted","Data":"5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537"} Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.230091 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=19.20147396 podStartE2EDuration="41.230066123s" podCreationTimestamp="2025-12-04 15:39:02 +0000 UTC" firstStartedPulling="2025-12-04 15:39:05.780849317 +0000 UTC m=+1153.215519174" lastFinishedPulling="2025-12-04 15:39:27.80944148 +0000 UTC m=+1175.244111337" observedRunningTime="2025-12-04 15:39:43.20225399 +0000 UTC m=+1190.636923847" watchObservedRunningTime="2025-12-04 15:39:43.230066123 +0000 UTC m=+1190.664735980" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.428180 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" path="/var/lib/kubelet/pods/99411ac6-aa35-4f96-bf75-783e3dcbdf93/volumes" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.436792 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=18.885614721 podStartE2EDuration="41.436768208s" podCreationTimestamp="2025-12-04 15:39:02 +0000 UTC" firstStartedPulling="2025-12-04 15:39:05.813485575 +0000 UTC m=+1153.248155432" lastFinishedPulling="2025-12-04 15:39:28.364639062 +0000 UTC m=+1175.799308919" observedRunningTime="2025-12-04 15:39:43.226075276 +0000 UTC m=+1190.660745133" watchObservedRunningTime="2025-12-04 15:39:43.436768208 +0000 UTC m=+1190.871438065" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.610820 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.611026 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.628403 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="b03adc1c-f52f-4170-8ba0-d8d24da99bb9" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.153:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:39:43 crc kubenswrapper[4676]: I1204 15:39:43.798262 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.227097 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerStarted","Data":"f9e9d71737fff903ea0b44cae1b98a8c216ca28e67b50eca54687db73ad34cf6"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.227284 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerStarted","Data":"4e16aacef709c51fca2b919af307e5a694d4f6297444f0e8223e61356d7f932b"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.229483 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/watcher-api-0" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.246493 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llvh8" event={"ID":"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9","Type":"ContainerStarted","Data":"b504620a44fd59ed7cfe1f1bb615ebcba66a9b4bce009c831026bbd1d75d22ad"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.261174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerStarted","Data":"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.277731 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74857cd458-nnlq7" event={"ID":"062c032e-aef9-4036-8d2b-dc89641ed977","Type":"ContainerStarted","Data":"541d81a6186159e071dd5f7997b549c4826272aac8f45b4368fcdc319d51be89"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.277793 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74857cd458-nnlq7" event={"ID":"062c032e-aef9-4036-8d2b-dc89641ed977","Type":"ContainerStarted","Data":"d7424ed0eee7f7d59b908391bd623e9820240c33d10e370925ba106e59172aa5"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.280874 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=4.280853406 podStartE2EDuration="4.280853406s" podCreationTimestamp="2025-12-04 15:39:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:44.250301654 +0000 UTC m=+1191.684971541" watchObservedRunningTime="2025-12-04 15:39:44.280853406 +0000 UTC m=+1191.715523263" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.291516 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-llvh8" podStartSLOduration=16.291489426 podStartE2EDuration="16.291489426s" podCreationTimestamp="2025-12-04 15:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:39:44.277970892 +0000 UTC m=+1191.712640769" watchObservedRunningTime="2025-12-04 15:39:44.291489426 +0000 UTC m=+1191.726159283" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.296896 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6b4sd" event={"ID":"4feecc1c-e63e-4063-947d-4c2c619525a7","Type":"ContainerStarted","Data":"11c57be9a216605a8eb9cf338f53e60890a4725eff3cc5faa4e8d4b71e23302d"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.310155 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerStarted","Data":"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.310225 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerStarted","Data":"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.329865 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jlg26" 
event={"ID":"89c93c13-31d1-4762-9457-90e32c63873e","Type":"ContainerStarted","Data":"c7400cddab5773a1ef1b0b5b07a00195620e9e0bb5906b8e89c03029e7620bef"} Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.338150 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74857cd458-nnlq7" podStartSLOduration=34.03819654 podStartE2EDuration="34.338127098s" podCreationTimestamp="2025-12-04 15:39:10 +0000 UTC" firstStartedPulling="2025-12-04 15:39:42.291390064 +0000 UTC m=+1189.726059921" lastFinishedPulling="2025-12-04 15:39:42.591320622 +0000 UTC m=+1190.025990479" observedRunningTime="2025-12-04 15:39:44.302053555 +0000 UTC m=+1191.736723412" watchObservedRunningTime="2025-12-04 15:39:44.338127098 +0000 UTC m=+1191.772796955" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.357542 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-6b4sd" podStartSLOduration=6.602609596 podStartE2EDuration="43.357519084s" podCreationTimestamp="2025-12-04 15:39:01 +0000 UTC" firstStartedPulling="2025-12-04 15:39:05.750015931 +0000 UTC m=+1153.184685788" lastFinishedPulling="2025-12-04 15:39:42.504925419 +0000 UTC m=+1189.939595276" observedRunningTime="2025-12-04 15:39:44.321258856 +0000 UTC m=+1191.755928713" watchObservedRunningTime="2025-12-04 15:39:44.357519084 +0000 UTC m=+1191.792188941" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.378033 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78c887c44-wcq82" podStartSLOduration=34.117172496 podStartE2EDuration="34.378009563s" podCreationTimestamp="2025-12-04 15:39:10 +0000 UTC" firstStartedPulling="2025-12-04 15:39:42.329823746 +0000 UTC m=+1189.764493603" lastFinishedPulling="2025-12-04 15:39:42.590660813 +0000 UTC m=+1190.025330670" observedRunningTime="2025-12-04 15:39:44.345197855 +0000 UTC m=+1191.779867722" watchObservedRunningTime="2025-12-04 15:39:44.378009563 +0000 UTC m=+1191.812679420" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.395872 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jlg26" podStartSLOduration=6.814780052 podStartE2EDuration="43.395847384s" podCreationTimestamp="2025-12-04 15:39:01 +0000 UTC" firstStartedPulling="2025-12-04 15:39:05.813774033 +0000 UTC m=+1153.248443880" lastFinishedPulling="2025-12-04 15:39:42.394841355 +0000 UTC m=+1189.829511212" observedRunningTime="2025-12-04 15:39:44.362409817 +0000 UTC m=+1191.797079684" watchObservedRunningTime="2025-12-04 15:39:44.395847384 +0000 UTC m=+1191.830517241" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.397161 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.446016 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-779f74f7bf-7rrdz" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: i/o timeout" Dec 04 15:39:44 crc kubenswrapper[4676]: I1204 15:39:44.459841 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:39:45 crc kubenswrapper[4676]: I1204 15:39:45.517305 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Dec 04 15:39:46 crc kubenswrapper[4676]: I1204 15:39:46.362170 4676 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier" containerID="cri-o://5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" gracePeriod=30 Dec 04 15:39:46 crc kubenswrapper[4676]: I1204 15:39:46.362756 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:39:47 crc kubenswrapper[4676]: I1204 15:39:47.477570 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:39:48 crc kubenswrapper[4676]: E1204 15:39:48.618212 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:48 crc kubenswrapper[4676]: E1204 15:39:48.622076 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:48 crc kubenswrapper[4676]: E1204 15:39:48.625349 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:48 crc kubenswrapper[4676]: E1204 15:39:48.625407 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier" Dec 04 15:39:49 crc kubenswrapper[4676]: I1204 15:39:49.351826 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.161:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:39:49 crc kubenswrapper[4676]: I1204 15:39:49.697180 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 15:39:50 crc kubenswrapper[4676]: I1204 15:39:50.490680 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Dec 04 15:39:50 crc kubenswrapper[4676]: I1204 15:39:50.522761 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Dec 04 15:39:50 crc kubenswrapper[4676]: I1204 15:39:50.650507 4676 generic.go:334] "Generic (PLEG): container finished" podID="89c93c13-31d1-4762-9457-90e32c63873e" containerID="c7400cddab5773a1ef1b0b5b07a00195620e9e0bb5906b8e89c03029e7620bef" exitCode=0 Dec 04 15:39:50 crc kubenswrapper[4676]: I1204 15:39:50.650576 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jlg26" event={"ID":"89c93c13-31d1-4762-9457-90e32c63873e","Type":"ContainerDied","Data":"c7400cddab5773a1ef1b0b5b07a00195620e9e0bb5906b8e89c03029e7620bef"} Dec 04 15:39:50 crc kubenswrapper[4676]: I1204 
15:39:50.659370 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.270873 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.271396 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.328411 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.328727 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74857cd458-nnlq7" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.661099 4676 generic.go:334] "Generic (PLEG): container finished" podID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerID="06ec8a0508c5113e48edf514194942f4a4010038009e62e14430ef46574abe17" exitCode=1 Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.661186 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"06ec8a0508c5113e48edf514194942f4a4010038009e62e14430ef46574abe17"} Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.662119 4676 scope.go:117] "RemoveContainer" containerID="06ec8a0508c5113e48edf514194942f4a4010038009e62e14430ef46574abe17" Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.664637 4676 generic.go:334] "Generic (PLEG): container finished" podID="b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" containerID="b504620a44fd59ed7cfe1f1bb615ebcba66a9b4bce009c831026bbd1d75d22ad" exitCode=0 Dec 04 15:39:51 crc kubenswrapper[4676]: I1204 15:39:51.664773 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llvh8" event={"ID":"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9","Type":"ContainerDied","Data":"b504620a44fd59ed7cfe1f1bb615ebcba66a9b4bce009c831026bbd1d75d22ad"} Dec 04 15:39:52 crc kubenswrapper[4676]: I1204 15:39:52.681764 4676 generic.go:334] "Generic (PLEG): container finished" podID="4feecc1c-e63e-4063-947d-4c2c619525a7" containerID="11c57be9a216605a8eb9cf338f53e60890a4725eff3cc5faa4e8d4b71e23302d" exitCode=0 Dec 04 15:39:52 crc kubenswrapper[4676]: I1204 15:39:52.681838 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6b4sd" event={"ID":"4feecc1c-e63e-4063-947d-4c2c619525a7","Type":"ContainerDied","Data":"11c57be9a216605a8eb9cf338f53e60890a4725eff3cc5faa4e8d4b71e23302d"} Dec 04 15:39:52 crc kubenswrapper[4676]: I1204 15:39:52.901253 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:39:52 crc kubenswrapper[4676]: I1204 15:39:52.901305 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:39:53 crc kubenswrapper[4676]: I1204 15:39:53.465919 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:53 crc kubenswrapper[4676]: E1204 15:39:53.618539 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" 
cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:53 crc kubenswrapper[4676]: E1204 15:39:53.621111 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:53 crc kubenswrapper[4676]: E1204 15:39:53.624465 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Dec 04 15:39:53 crc kubenswrapper[4676]: E1204 15:39:53.624651 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier" Dec 04 15:39:53 crc kubenswrapper[4676]: I1204 15:39:53.690080 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" containerID="cri-o://f9e9d71737fff903ea0b44cae1b98a8c216ca28e67b50eca54687db73ad34cf6" gracePeriod=30 Dec 04 15:39:53 crc kubenswrapper[4676]: I1204 15:39:53.694204 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api-log" containerID="cri-o://4e16aacef709c51fca2b919af307e5a694d4f6297444f0e8223e61356d7f932b" gracePeriod=30 Dec 04 15:39:54 crc kubenswrapper[4676]: I1204 15:39:54.712139 4676 generic.go:334] "Generic (PLEG): container finished" podID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerID="4e16aacef709c51fca2b919af307e5a694d4f6297444f0e8223e61356d7f932b" exitCode=143 Dec 04 15:39:54 crc kubenswrapper[4676]: I1204 15:39:54.712210 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerDied","Data":"4e16aacef709c51fca2b919af307e5a694d4f6297444f0e8223e61356d7f932b"} Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.490240 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.161:9322/\": dial tcp 10.217.0.161:9322: connect: connection refused" Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.490240 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api-log" probeResult="failure" output="Get \"http://10.217.0.161:9322/\": dial tcp 10.217.0.161:9322: connect: connection refused" Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.727213 4676 generic.go:334] "Generic (PLEG): container finished" podID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerID="f9e9d71737fff903ea0b44cae1b98a8c216ca28e67b50eca54687db73ad34cf6" exitCode=0 Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.727261 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerDied","Data":"f9e9d71737fff903ea0b44cae1b98a8c216ca28e67b50eca54687db73ad34cf6"} Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.933509 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.988239 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:55 crc kubenswrapper[4676]: I1204 15:39:55.998849 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.098969 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099419 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts\") pod \"89c93c13-31d1-4762-9457-90e32c63873e\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099473 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data\") pod \"89c93c13-31d1-4762-9457-90e32c63873e\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099529 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs\") pod \"89c93c13-31d1-4762-9457-90e32c63873e\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099596 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099636 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dznx9\" (UniqueName: \"kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099665 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099706 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle\") pod \"89c93c13-31d1-4762-9457-90e32c63873e\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " Dec 04 15:39:56 crc 
kubenswrapper[4676]: I1204 15:39:56.099734 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.099848 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68b9f\" (UniqueName: \"kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f\") pod \"89c93c13-31d1-4762-9457-90e32c63873e\" (UID: \"89c93c13-31d1-4762-9457-90e32c63873e\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.102003 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data\") pod \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\" (UID: \"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.107504 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs" (OuterVolumeSpecName: "logs") pod "89c93c13-31d1-4762-9457-90e32c63873e" (UID: "89c93c13-31d1-4762-9457-90e32c63873e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.111495 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts" (OuterVolumeSpecName: "scripts") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.111629 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.112577 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9" (OuterVolumeSpecName: "kube-api-access-dznx9") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "kube-api-access-dznx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.112681 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f" (OuterVolumeSpecName: "kube-api-access-68b9f") pod "89c93c13-31d1-4762-9457-90e32c63873e" (UID: "89c93c13-31d1-4762-9457-90e32c63873e"). InnerVolumeSpecName "kube-api-access-68b9f". 
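
The teardown entries around here follow a fixed three-step pattern per volume: reconciler_common.go:159 logs "UnmountVolume started", operation_generator.go:803 logs "UnmountVolume.TearDown succeeded", and reconciler_common.go:293 finally reports "Volume detached". A schematic Go sketch of that state machine; illustrative only, not kubelet's actual implementation (the pod UID and volume names are copied from the keystone-bootstrap entries nearby):

```go
package main

import "fmt"

// Schematic model of the volume teardown path seen in the entries above:
// UnmountVolume started -> TearDown succeeded -> Volume detached.
type volumeState int

const (
	mounted volumeState = iota
	unmounting
	detached
)

type volume struct {
	name  string
	state volumeState
}

// teardown walks each volume of a deleted pod through the three phases,
// mirroring the reconciler_common/operation_generator log lines.
func teardown(podUID string, vols []*volume) {
	for _, v := range vols {
		v.state = unmounting
		fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, podUID)
		// Real kubelet delegates to the volume plugin's TearDown here.
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v.name)
		v.state = detached
		fmt.Printf("Volume detached for volume %q\n", v.name)
	}
}

func main() {
	teardown("b7efd4bd-bb88-4422-9bd3-04ddb66d35a9", []*volume{
		{name: "scripts"}, {name: "config-data"}, {name: "fernet-keys"},
	})
}
```
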
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.117476 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts" (OuterVolumeSpecName: "scripts") pod "89c93c13-31d1-4762-9457-90e32c63873e" (UID: "89c93c13-31d1-4762-9457-90e32c63873e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.128576 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.146380 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data" (OuterVolumeSpecName: "config-data") pod "89c93c13-31d1-4762-9457-90e32c63873e" (UID: "89c93c13-31d1-4762-9457-90e32c63873e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.147639 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.152289 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data" (OuterVolumeSpecName: "config-data") pod "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" (UID: "b7efd4bd-bb88-4422-9bd3-04ddb66d35a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.191969 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89c93c13-31d1-4762-9457-90e32c63873e" (UID: "89c93c13-31d1-4762-9457-90e32c63873e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.205661 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle\") pod \"4feecc1c-e63e-4063-947d-4c2c619525a7\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.205772 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data\") pod \"4feecc1c-e63e-4063-947d-4c2c619525a7\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.205829 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7lbp\" (UniqueName: \"kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp\") pod \"4feecc1c-e63e-4063-947d-4c2c619525a7\" (UID: \"4feecc1c-e63e-4063-947d-4c2c619525a7\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206304 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206327 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206340 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89c93c13-31d1-4762-9457-90e32c63873e-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206349 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206357 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dznx9\" (UniqueName: \"kubernetes.io/projected/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-kube-api-access-dznx9\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206366 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206375 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89c93c13-31d1-4762-9457-90e32c63873e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206383 4676 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206391 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68b9f\" (UniqueName: \"kubernetes.io/projected/89c93c13-31d1-4762-9457-90e32c63873e-kube-api-access-68b9f\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: 
I1204 15:39:56.206398 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.206406 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.210202 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp" (OuterVolumeSpecName: "kube-api-access-v7lbp") pod "4feecc1c-e63e-4063-947d-4c2c619525a7" (UID: "4feecc1c-e63e-4063-947d-4c2c619525a7"). InnerVolumeSpecName "kube-api-access-v7lbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.216669 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4feecc1c-e63e-4063-947d-4c2c619525a7" (UID: "4feecc1c-e63e-4063-947d-4c2c619525a7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.220302 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.265190 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4feecc1c-e63e-4063-947d-4c2c619525a7" (UID: "4feecc1c-e63e-4063-947d-4c2c619525a7"). InnerVolumeSpecName "combined-ca-bundle". 
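
The pod_startup_latency_tracker entries earlier in this window report two figures per pod: podStartE2EDuration (time from podCreationTimestamp to the pod being observed running) and podStartSLOduration, which excludes the time spent pulling images. A worked check of that arithmetic against the watcher-decision-engine-0 entry, using the timestamps exactly as logged:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduces the podStartSLOduration arithmetic from the
// pod_startup_latency_tracker entries above: SLO duration equals the
// end-to-end startup time minus the time spent pulling images.
func main() {
	// Layout matching Go's default time.Time formatting, which is what
	// the kubelet prints in these log lines.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the watcher-decision-engine-0 entry.
	firstStartedPulling := parse("2025-12-04 15:39:05.780849317 +0000 UTC")
	lastFinishedPulling := parse("2025-12-04 15:39:27.80944148 +0000 UTC")
	e2e := 41.230066123 // podStartE2EDuration, seconds

	pulling := lastFinishedPulling.Sub(firstStartedPulling).Seconds()
	slo := e2e - pulling
	fmt.Printf("image pulling: %.9fs, SLO duration: %.9fs\n", pulling, slo)
	// Image pulling takes ~22.028592163s, giving an SLO duration of
	// ~19.201473960s, matching podStartSLOduration=19.20147396 above.
}
```
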
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.308034 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7lbp\" (UniqueName: \"kubernetes.io/projected/4feecc1c-e63e-4063-947d-4c2c619525a7-kube-api-access-v7lbp\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.308295 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.308379 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4feecc1c-e63e-4063-947d-4c2c619525a7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.409519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca\") pod \"30695e65-6a6b-4fd3-b913-592efbdb6e59\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.409590 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdwlf\" (UniqueName: \"kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf\") pod \"30695e65-6a6b-4fd3-b913-592efbdb6e59\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.409619 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs\") pod \"30695e65-6a6b-4fd3-b913-592efbdb6e59\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.409640 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data\") pod \"30695e65-6a6b-4fd3-b913-592efbdb6e59\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.409723 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle\") pod \"30695e65-6a6b-4fd3-b913-592efbdb6e59\" (UID: \"30695e65-6a6b-4fd3-b913-592efbdb6e59\") " Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.411010 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs" (OuterVolumeSpecName: "logs") pod "30695e65-6a6b-4fd3-b913-592efbdb6e59" (UID: "30695e65-6a6b-4fd3-b913-592efbdb6e59"). InnerVolumeSpecName "logs". 
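
"Killing container with a grace period" (kuberuntime_container.go:808) means the runtime first asks the container to stop with SIGTERM, waits up to gracePeriod seconds (30 in the watcher-applier and watcher-api entries above), and only then sends SIGKILL. A minimal stand-alone sketch of that pattern on an ordinary process, not a real container:

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// Sketch of the SIGTERM -> wait -> SIGKILL sequence behind
// "Killing container with a grace period". Illustrative only.
func main() {
	cmd := exec.Command("sleep", "300") // stand-in for the container process
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	const gracePeriod = 30 * time.Second // matches gracePeriod=30 in the log
	cmd.Process.Signal(syscall.SIGTERM)  // polite stop request

	select {
	case err := <-done:
		fmt.Println("exited within grace period:", err)
	case <-time.After(gracePeriod):
		cmd.Process.Kill() // grace period expired: SIGKILL
		fmt.Println("grace period expired, killed:", <-done)
	}
}
```
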
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.412305 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30695e65-6a6b-4fd3-b913-592efbdb6e59-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.415522 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf" (OuterVolumeSpecName: "kube-api-access-hdwlf") pod "30695e65-6a6b-4fd3-b913-592efbdb6e59" (UID: "30695e65-6a6b-4fd3-b913-592efbdb6e59"). InnerVolumeSpecName "kube-api-access-hdwlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.436667 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "30695e65-6a6b-4fd3-b913-592efbdb6e59" (UID: "30695e65-6a6b-4fd3-b913-592efbdb6e59"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.459354 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30695e65-6a6b-4fd3-b913-592efbdb6e59" (UID: "30695e65-6a6b-4fd3-b913-592efbdb6e59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.474542 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data" (OuterVolumeSpecName: "config-data") pod "30695e65-6a6b-4fd3-b913-592efbdb6e59" (UID: "30695e65-6a6b-4fd3-b913-592efbdb6e59"). InnerVolumeSpecName "config-data". 
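
The repeated ExecSync failures for watcher-applier above come from an exec readiness probe: the kubelet asks the runtime to run /usr/bin/pgrep inside the container, and once CRI-O has started stopping that container it refuses to register new exec PIDs, so the probe errors out rather than failing cleanly. A sketch of how such a probe is declared in a pod spec; the command is copied verbatim from the logged cmd, while the timing fields are assumptions not visible in the log:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Exec readiness probe matching the logged
	// cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"].
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/usr/bin/pgrep", "-r", "DRST", "watcher-applier"},
			},
		},
		PeriodSeconds:  5, // assumed; not visible in the log
		TimeoutSeconds: 3, // assumed
	}
	fmt.Printf("%+v\n", probe)
}
```
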
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.514007 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.514049 4676 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.514064 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdwlf\" (UniqueName: \"kubernetes.io/projected/30695e65-6a6b-4fd3-b913-592efbdb6e59-kube-api-access-hdwlf\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.514080 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30695e65-6a6b-4fd3-b913-592efbdb6e59-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.737850 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jlg26" event={"ID":"89c93c13-31d1-4762-9457-90e32c63873e","Type":"ContainerDied","Data":"9131dfc0f2beff70028a7dd02300fcecae8b70b1388bd9bb748563e5edb563b7"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.737957 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9131dfc0f2beff70028a7dd02300fcecae8b70b1388bd9bb748563e5edb563b7" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.738001 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jlg26" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.742823 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"30695e65-6a6b-4fd3-b913-592efbdb6e59","Type":"ContainerDied","Data":"920e42e2f6d1710ff7746d4e8c9fd2a421138a0fd5e72681a7cefcf8ea903df6"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.742873 4676 scope.go:117] "RemoveContainer" containerID="f9e9d71737fff903ea0b44cae1b98a8c216ca28e67b50eca54687db73ad34cf6" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.743013 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.753243 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-llvh8" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.753418 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-llvh8" event={"ID":"b7efd4bd-bb88-4422-9bd3-04ddb66d35a9","Type":"ContainerDied","Data":"cf0fd718398c0fd6ca6178cf80dbebc069329efe9e2732dc7d0757afc8a93607"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.753454 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf0fd718398c0fd6ca6178cf80dbebc069329efe9e2732dc7d0757afc8a93607" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.756269 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-6b4sd" event={"ID":"4feecc1c-e63e-4063-947d-4c2c619525a7","Type":"ContainerDied","Data":"51c5b9d3a7fad3f4558a2774d0f486b3ef89b76f2e698a1fe200334f1d8fca30"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.756311 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51c5b9d3a7fad3f4558a2774d0f486b3ef89b76f2e698a1fe200334f1d8fca30" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.756375 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-6b4sd" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.759973 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.765706 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerStarted","Data":"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"} Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.816851 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.856016 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.861992 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862543 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c93c13-31d1-4762-9457-90e32c63873e" containerName="placement-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862578 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c93c13-31d1-4762-9457-90e32c63873e" containerName="placement-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862593 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api-log" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862601 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api-log" Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862609 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862616 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" Dec 04 
15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862630 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="init" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862638 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="init" Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862652 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" containerName="keystone-bootstrap" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862657 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" containerName="keystone-bootstrap" Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862673 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" containerName="barbican-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862679 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" containerName="barbican-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: E1204 15:39:56.862690 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862698 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862886 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" containerName="barbican-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862899 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862925 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c93c13-31d1-4762-9457-90e32c63873e" containerName="placement-db-sync" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862944 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" containerName="keystone-bootstrap" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862954 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" containerName="watcher-api-log" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.862967 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="99411ac6-aa35-4f96-bf75-783e3dcbdf93" containerName="dnsmasq-dns" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.864148 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.867006 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.867006 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.867008 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.874564 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920416 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-logs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920505 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-config-data\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920532 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920575 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920654 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920692 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:56 crc kubenswrapper[4676]: I1204 15:39:56.920779 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thqvc\" (UniqueName: \"kubernetes.io/projected/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-kube-api-access-thqvc\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.026836 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-logs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027009 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-config-data\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027043 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027236 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027300 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027447 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thqvc\" (UniqueName: \"kubernetes.io/projected/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-kube-api-access-thqvc\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.027503 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-logs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.033430 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.039443 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-config-data\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.040697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.043001 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.043271 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.047626 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thqvc\" (UniqueName: \"kubernetes.io/projected/5bd9cd7f-a3cb-4304-9ce9-73903875b9cd-kube-api-access-thqvc\") pod \"watcher-api-0\" (UID: \"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd\") " pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.079267 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-75f9dc548b-ctwhb"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.082844 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.089214 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.089284 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.089618 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-55k2j" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.089744 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.090111 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.115964 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75f9dc548b-ctwhb"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129499 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-combined-ca-bundle\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129565 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-internal-tls-certs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129619 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-config-data\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129640 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-public-tls-certs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bea0dc8-b7f4-4623-95a6-813e42180090-logs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129722 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rhsr\" (UniqueName: \"kubernetes.io/projected/0bea0dc8-b7f4-4623-95a6-813e42180090-kube-api-access-2rhsr\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.129744 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-scripts\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.177084 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-97885899c-28t7l"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.180890 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.185997 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vsxn6" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.186157 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.186176 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.186298 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.187984 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.188309 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.188801 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.192329 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97885899c-28t7l"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235579 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-internal-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235652 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-config-data\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235683 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-config-data\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235710 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-public-tls-certs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235758 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-public-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235780 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bea0dc8-b7f4-4623-95a6-813e42180090-logs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhv9c\" (UniqueName: \"kubernetes.io/projected/2d21c3e9-53ed-4671-b832-04c115971b6c-kube-api-access-fhv9c\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235872 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rhsr\" (UniqueName: \"kubernetes.io/projected/0bea0dc8-b7f4-4623-95a6-813e42180090-kube-api-access-2rhsr\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235896 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-scripts\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235946 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-combined-ca-bundle\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.235970 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-scripts\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.236014 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-combined-ca-bundle\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.236037 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-fernet-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.236070 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-credential-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.236104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-internal-tls-certs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.241360 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bea0dc8-b7f4-4623-95a6-813e42180090-logs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.243078 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-internal-tls-certs\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.245565 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-public-tls-certs\") pod 
\"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.249326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-config-data\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.254429 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-scripts\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.266625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rhsr\" (UniqueName: \"kubernetes.io/projected/0bea0dc8-b7f4-4623-95a6-813e42180090-kube-api-access-2rhsr\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.270736 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bea0dc8-b7f4-4623-95a6-813e42180090-combined-ca-bundle\") pod \"placement-75f9dc548b-ctwhb\" (UID: \"0bea0dc8-b7f4-4623-95a6-813e42180090\") " pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337477 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-combined-ca-bundle\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337548 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-scripts\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-fernet-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337641 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-credential-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-internal-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: 
I1204 15:39:57.337745 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-config-data\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337802 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-public-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.337820 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhv9c\" (UniqueName: \"kubernetes.io/projected/2d21c3e9-53ed-4671-b832-04c115971b6c-kube-api-access-fhv9c\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.350432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-public-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.352090 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-config-data\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.353675 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-fernet-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.353725 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-combined-ca-bundle\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.355626 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-scripts\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.371634 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-credential-keys\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.385503 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d21c3e9-53ed-4671-b832-04c115971b6c-internal-tls-certs\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.414145 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhv9c\" (UniqueName: \"kubernetes.io/projected/2d21c3e9-53ed-4671-b832-04c115971b6c-kube-api-access-fhv9c\") pod \"keystone-97885899c-28t7l\" (UID: \"2d21c3e9-53ed-4671-b832-04c115971b6c\") " pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.441082 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30695e65-6a6b-4fd3-b913-592efbdb6e59" path="/var/lib/kubelet/pods/30695e65-6a6b-4fd3-b913-592efbdb6e59/volumes" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.453989 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.498016 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7d8d8c7d4-6r94k"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.505165 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-97885899c-28t7l" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.516806 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d67df5bf5-pk5hl"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.517640 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.524352 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.527682 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.528507 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r22qq" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.529070 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.529236 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.556120 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d8d8c7d4-6r94k"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566036 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbnr\" (UniqueName: \"kubernetes.io/projected/cf2c938b-0504-4743-95aa-40338211a37c-kube-api-access-hlbnr\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566124 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsbzs\" (UniqueName: \"kubernetes.io/projected/8ca5926d-be39-4cda-b11d-bbed877ffa22-kube-api-access-xsbzs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566156 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2c938b-0504-4743-95aa-40338211a37c-logs\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566275 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566373 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566420 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data-custom\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" 
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566456 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca5926d-be39-4cda-b11d-bbed877ffa22-logs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data-custom\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566524 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.566563 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-combined-ca-bundle\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.575804 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d67df5bf5-pk5hl"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.608600 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.610777 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.643471 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669336 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669365 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669568 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669709 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data-custom\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669763 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca5926d-be39-4cda-b11d-bbed877ffa22-logs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.669796 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data-custom\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 
15:39:57.669835 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.670123 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.670163 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-combined-ca-bundle\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.670528 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9cd\" (UniqueName: \"kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.670597 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbnr\" (UniqueName: \"kubernetes.io/projected/cf2c938b-0504-4743-95aa-40338211a37c-kube-api-access-hlbnr\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.671474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.671538 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsbzs\" (UniqueName: \"kubernetes.io/projected/8ca5926d-be39-4cda-b11d-bbed877ffa22-kube-api-access-xsbzs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.671562 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2c938b-0504-4743-95aa-40338211a37c-logs\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.671596 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ca5926d-be39-4cda-b11d-bbed877ffa22-logs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 
04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.672417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf2c938b-0504-4743-95aa-40338211a37c-logs\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.679166 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.679735 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-config-data-custom\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.681314 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.682092 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-config-data-custom\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.687722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ca5926d-be39-4cda-b11d-bbed877ffa22-combined-ca-bundle\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.689856 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf2c938b-0504-4743-95aa-40338211a37c-combined-ca-bundle\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.710024 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbnr\" (UniqueName: \"kubernetes.io/projected/cf2c938b-0504-4743-95aa-40338211a37c-kube-api-access-hlbnr\") pod \"barbican-keystone-listener-7d8d8c7d4-6r94k\" (UID: \"cf2c938b-0504-4743-95aa-40338211a37c\") " pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.714773 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsbzs\" (UniqueName: \"kubernetes.io/projected/8ca5926d-be39-4cda-b11d-bbed877ffa22-kube-api-access-xsbzs\") pod \"barbican-worker-d67df5bf5-pk5hl\" (UID: \"8ca5926d-be39-4cda-b11d-bbed877ffa22\") " 
pod="openstack/barbican-worker-d67df5bf5-pk5hl" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.719033 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.722096 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.727346 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.750242 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"] Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.772761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.772804 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.772836 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9cd\" (UniqueName: \"kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.772866 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.772889 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.773306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-266bq\" (UniqueName: \"kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.773368 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:39:57 crc 
kubenswrapper[4676]: I1204 15:39:57.773393 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.773411 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.773434 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.773458 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.774468 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.775122 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.775731 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.776615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.776945 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.792556 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9cd\" (UniqueName: \"kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd\") pod \"dnsmasq-dns-6845bf8cdc-5xmc9\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.792772 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr52" event={"ID":"c8534e22-ee3e-4b6c-92a8-1790b69f335d","Type":"ContainerStarted","Data":"d1f4f8e5e1f465b90a63581e1555bf9447784bf91a9c5d224acf43b302f36460"}
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.820106 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-nnr52" podStartSLOduration=5.6752295230000005 podStartE2EDuration="54.820062871s" podCreationTimestamp="2025-12-04 15:39:03 +0000 UTC" firstStartedPulling="2025-12-04 15:39:06.877208081 +0000 UTC m=+1154.311877928" lastFinishedPulling="2025-12-04 15:39:56.022041419 +0000 UTC m=+1203.456711276" observedRunningTime="2025-12-04 15:39:57.813218051 +0000 UTC m=+1205.247887908" watchObservedRunningTime="2025-12-04 15:39:57.820062871 +0000 UTC m=+1205.254732718"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.875229 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.875486 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-266bq\" (UniqueName: \"kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.875601 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.875707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.875974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.878529 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.879684 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.885678 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.889533 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.889616 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.906267 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-266bq\" (UniqueName: \"kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq\") pod \"barbican-api-bbb65f7b4-kp2f2\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") " pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.932355 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d67df5bf5-pk5hl"
Dec 04 15:39:57 crc kubenswrapper[4676]: I1204 15:39:57.959429 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9"
Dec 04 15:39:58 crc kubenswrapper[4676]: I1204 15:39:58.050750 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:39:58 crc kubenswrapper[4676]: E1204 15:39:58.614432 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:39:58 crc kubenswrapper[4676]: E1204 15:39:58.617197 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:39:58 crc kubenswrapper[4676]: E1204 15:39:58.618666 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:39:58 crc kubenswrapper[4676]: E1204 15:39:58.618742 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.739120 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c64fd75cd-msd6p"]
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.740978 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.743086 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.745258 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.753427 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c64fd75cd-msd6p"]
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833619 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-combined-ca-bundle\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833669 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa2202e-331f-46d6-b6a8-e7b6484029f6-logs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833706 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-internal-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833755 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-public-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833821 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhd9w\" (UniqueName: \"kubernetes.io/projected/baa2202e-331f-46d6-b6a8-e7b6484029f6-kube-api-access-vhd9w\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.833881 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data-custom\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935362 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-combined-ca-bundle\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935421 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa2202e-331f-46d6-b6a8-e7b6484029f6-logs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-internal-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935509 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-public-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935561 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhd9w\" (UniqueName: \"kubernetes.io/projected/baa2202e-331f-46d6-b6a8-e7b6484029f6-kube-api-access-vhd9w\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.935611 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data-custom\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.937109 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa2202e-331f-46d6-b6a8-e7b6484029f6-logs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.942819 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-public-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.945575 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-internal-tls-certs\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.945857 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data-custom\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.945953 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-combined-ca-bundle\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.951832 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa2202e-331f-46d6-b6a8-e7b6484029f6-config-data\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:00 crc kubenswrapper[4676]: I1204 15:40:00.976213 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhd9w\" (UniqueName: \"kubernetes.io/projected/baa2202e-331f-46d6-b6a8-e7b6484029f6-kube-api-access-vhd9w\") pod \"barbican-api-6c64fd75cd-msd6p\" (UID: \"baa2202e-331f-46d6-b6a8-e7b6484029f6\") " pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:01 crc kubenswrapper[4676]: I1204 15:40:01.071034 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:01 crc kubenswrapper[4676]: I1204 15:40:01.897683 4676 generic.go:334] "Generic (PLEG): container finished" podID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerID="d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d" exitCode=1
Dec 04 15:40:01 crc kubenswrapper[4676]: I1204 15:40:01.897784 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"}
Dec 04 15:40:01 crc kubenswrapper[4676]: I1204 15:40:01.898820 4676 scope.go:117] "RemoveContainer" containerID="d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"
Dec 04 15:40:01 crc kubenswrapper[4676]: E1204 15:40:01.908165 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:02 crc kubenswrapper[4676]: I1204 15:40:02.901204 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:02 crc kubenswrapper[4676]: I1204 15:40:02.901256 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:02 crc kubenswrapper[4676]: I1204 15:40:02.901266 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:02 crc kubenswrapper[4676]: I1204 15:40:02.901278 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:02 crc kubenswrapper[4676]: I1204 15:40:02.910896 4676 scope.go:117] "RemoveContainer" containerID="d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"
Dec 04 15:40:02 crc kubenswrapper[4676]: E1204 15:40:02.911419 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:03 crc kubenswrapper[4676]: I1204 15:40:03.273953 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-74857cd458-nnlq7"
Dec 04 15:40:03 crc kubenswrapper[4676]: I1204 15:40:03.307130 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78c887c44-wcq82"
Dec 04 15:40:03 crc kubenswrapper[4676]: E1204 15:40:03.612702 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:03 crc kubenswrapper[4676]: E1204 15:40:03.613864 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:03 crc kubenswrapper[4676]: E1204 15:40:03.615244 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:03 crc kubenswrapper[4676]: E1204 15:40:03.615284 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.226842 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-74857cd458-nnlq7"
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.295383 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78c887c44-wcq82"]
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.300281 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon-log" containerID="cri-o://b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa" gracePeriod=30
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.300366 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" containerID="cri-o://061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940" gracePeriod=30
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.305473 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": EOF"
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.311194 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": EOF"
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.952851 4676 generic.go:334] "Generic (PLEG): container finished" podID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerID="061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940" exitCode=0
Dec 04 15:40:05 crc kubenswrapper[4676]: I1204 15:40:05.952931 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerDied","Data":"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940"}
Dec 04 15:40:06 crc kubenswrapper[4676]: E1204 15:40:06.793683 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Dec 04 15:40:06 crc kubenswrapper[4676]: E1204 15:40:06.793791 4676 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.129.56.200:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Dec 04 15:40:06 crc kubenswrapper[4676]: E1204 15:40:06.794059 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.129.56.200:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8c86c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-pksjc_openstack(3ac7518d-e354-42a9-85e4-766e455bf838): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 04 15:40:06 crc kubenswrapper[4676]: E1204 15:40:06.795305 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-pksjc" podUID="3ac7518d-e354-42a9-85e4-766e455bf838"
Dec 04 15:40:06 crc kubenswrapper[4676]: I1204 15:40:06.838752 4676 scope.go:117] "RemoveContainer" containerID="4e16aacef709c51fca2b919af307e5a694d4f6297444f0e8223e61356d7f932b"
Dec 04 15:40:06 crc kubenswrapper[4676]: E1204 15:40:06.993386 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.129.56.200:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-pksjc" podUID="3ac7518d-e354-42a9-85e4-766e455bf838"
Dec 04 15:40:06 crc kubenswrapper[4676]: I1204 15:40:06.993848 4676 scope.go:117] "RemoveContainer" containerID="06ec8a0508c5113e48edf514194942f4a4010038009e62e14430ef46574abe17"
Dec 04 15:40:07 crc kubenswrapper[4676]: W1204 15:40:07.386118 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ca5926d_be39_4cda_b11d_bbed877ffa22.slice/crio-77a99ce728100ba71103e7d1e320ffa11e8269cee751d8f5c2c71a74f8ae2ed6 WatchSource:0}: Error finding container 77a99ce728100ba71103e7d1e320ffa11e8269cee751d8f5c2c71a74f8ae2ed6: Status 404 returned error can't find the container with id 77a99ce728100ba71103e7d1e320ffa11e8269cee751d8f5c2c71a74f8ae2ed6
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.402741 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d67df5bf5-pk5hl"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.642761 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.656276 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-75f9dc548b-ctwhb"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.666711 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Dec 04 15:40:07 crc kubenswrapper[4676]: W1204 15:40:07.670219 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf2c938b_0504_4743_95aa_40338211a37c.slice/crio-e1921c2322bc187029a397d1a13c20360b120033eaf359d94cc00a7f79fd7bd3 WatchSource:0}: Error finding container e1921c2322bc187029a397d1a13c20360b120033eaf359d94cc00a7f79fd7bd3: Status 404 returned error can't find the container with id e1921c2322bc187029a397d1a13c20360b120033eaf359d94cc00a7f79fd7bd3
Dec 04 15:40:07 crc kubenswrapper[4676]: W1204 15:40:07.670983 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd9cd7f_a3cb_4304_9ce9_73903875b9cd.slice/crio-a99391e4d83768d83c75e20d93233c0401fadf92e59dec0d4e0b8f7ae0d17ed7 WatchSource:0}: Error finding container a99391e4d83768d83c75e20d93233c0401fadf92e59dec0d4e0b8f7ae0d17ed7: Status 404 returned error can't find the container with id a99391e4d83768d83c75e20d93233c0401fadf92e59dec0d4e0b8f7ae0d17ed7
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.675463 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7d8d8c7d4-6r94k"]
Dec 04 15:40:07 crc kubenswrapper[4676]: W1204 15:40:07.683629 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bea0dc8_b7f4_4623_95a6_813e42180090.slice/crio-addf1700a754db237dc0b3a3172b1124ff92cc8c837394cf9632bad39a6410ce WatchSource:0}: Error finding container addf1700a754db237dc0b3a3172b1124ff92cc8c837394cf9632bad39a6410ce: Status 404 returned error can't find the container with id addf1700a754db237dc0b3a3172b1124ff92cc8c837394cf9632bad39a6410ce
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.878171 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-97885899c-28t7l"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.889384 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.898384 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c64fd75cd-msd6p"]
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.994946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c64fd75cd-msd6p" event={"ID":"baa2202e-331f-46d6-b6a8-e7b6484029f6","Type":"ContainerStarted","Data":"cc6c867779ac7ce77b341d50ed7ea1648b7ee079cae5ba1483997a7ec1589016"}
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.997729 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" event={"ID":"40ae722b-54ad-4066-b690-5639be42c4f7","Type":"ContainerStarted","Data":"d1f52e58125da086b8a975e8c1725a4d0a00e45c0fb65ef8a1228d125c8c5c68"}
Dec 04 15:40:07 crc kubenswrapper[4676]: I1204 15:40:07.998843 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerStarted","Data":"db8f43b4e80885e576533d60eef5e5c0270919937af6e618c67eeed395ad0e37"}
Dec 04 15:40:08 crc kubenswrapper[4676]: I1204 15:40:08.001255 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f9dc548b-ctwhb" event={"ID":"0bea0dc8-b7f4-4623-95a6-813e42180090","Type":"ContainerStarted","Data":"addf1700a754db237dc0b3a3172b1124ff92cc8c837394cf9632bad39a6410ce"}
Dec 04 15:40:08 crc kubenswrapper[4676]: I1204 15:40:08.002772 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" event={"ID":"cf2c938b-0504-4743-95aa-40338211a37c","Type":"ContainerStarted","Data":"e1921c2322bc187029a397d1a13c20360b120033eaf359d94cc00a7f79fd7bd3"}
Dec 04 15:40:08 crc kubenswrapper[4676]: I1204 15:40:08.004292 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d67df5bf5-pk5hl" event={"ID":"8ca5926d-be39-4cda-b11d-bbed877ffa22","Type":"ContainerStarted","Data":"77a99ce728100ba71103e7d1e320ffa11e8269cee751d8f5c2c71a74f8ae2ed6"}
Dec 04 15:40:08 crc kubenswrapper[4676]: I1204 15:40:08.005896 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd","Type":"ContainerStarted","Data":"a99391e4d83768d83c75e20d93233c0401fadf92e59dec0d4e0b8f7ae0d17ed7"}
Dec 04 15:40:08 crc kubenswrapper[4676]: I1204 15:40:08.007027 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97885899c-28t7l" event={"ID":"2d21c3e9-53ed-4671-b832-04c115971b6c","Type":"ContainerStarted","Data":"6d72e9acf57d519d465f68e84689958ccf4e2a6f88c823ab5f69c35121ce0018"}
Dec 04 15:40:08 crc kubenswrapper[4676]: E1204 15:40:08.613421 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:08 crc kubenswrapper[4676]: E1204 15:40:08.615518 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:08 crc kubenswrapper[4676]: E1204 15:40:08.619037 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:08 crc kubenswrapper[4676]: E1204 15:40:08.619121 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.021263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c64fd75cd-msd6p" event={"ID":"baa2202e-331f-46d6-b6a8-e7b6484029f6","Type":"ContainerStarted","Data":"71ec9ff1fcddef901aa9448b2543b87ebed569438ba55e0b6f406229ab7612fc"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.023020 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd","Type":"ContainerStarted","Data":"33d8e6a1f171a6aa2ce696c0375e4a547b8725f51cb29f499c28244830a0bef3"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.024462 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-97885899c-28t7l" event={"ID":"2d21c3e9-53ed-4671-b832-04c115971b6c","Type":"ContainerStarted","Data":"9015db188c76564b95fecc18328fbda53db8510638e8a8dd7dbe475ae4f17119"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.024633 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-97885899c-28t7l"
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.026068 4676 generic.go:334] "Generic (PLEG): container finished" podID="40ae722b-54ad-4066-b690-5639be42c4f7" containerID="767196b55b820c811f159ad655fdad46b26d039f0b4a40b416c3f227556037b7" exitCode=0
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.026311 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" event={"ID":"40ae722b-54ad-4066-b690-5639be42c4f7","Type":"ContainerDied","Data":"767196b55b820c811f159ad655fdad46b26d039f0b4a40b416c3f227556037b7"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.027468 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerStarted","Data":"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.030150 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f9dc548b-ctwhb" event={"ID":"0bea0dc8-b7f4-4623-95a6-813e42180090","Type":"ContainerStarted","Data":"7bde574e24145204ffa3a110c2e27e3b20571737aebbd37c5f9e6e91ce91e0a7"}
Dec 04 15:40:09 crc kubenswrapper[4676]: I1204 15:40:09.054723 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-97885899c-28t7l" podStartSLOduration=12.054701832 podStartE2EDuration="12.054701832s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:09.040771265 +0000 UTC m=+1216.475441122" watchObservedRunningTime="2025-12-04 15:40:09.054701832 +0000 UTC m=+1216.489371679"
Dec 04 15:40:11 crc kubenswrapper[4676]: I1204 15:40:11.049497 4676 generic.go:334] "Generic (PLEG): container finished" podID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" containerID="d1f4f8e5e1f465b90a63581e1555bf9447784bf91a9c5d224acf43b302f36460" exitCode=0
Dec 04 15:40:11 crc kubenswrapper[4676]: I1204 15:40:11.049584 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr52" event={"ID":"c8534e22-ee3e-4b6c-92a8-1790b69f335d","Type":"ContainerDied","Data":"d1f4f8e5e1f465b90a63581e1555bf9447784bf91a9c5d224acf43b302f36460"}
Dec 04 15:40:11 crc kubenswrapper[4676]: I1204 15:40:11.270431 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Dec 04 15:40:12 crc kubenswrapper[4676]: I1204 15:40:12.064100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c64fd75cd-msd6p" event={"ID":"baa2202e-331f-46d6-b6a8-e7b6484029f6","Type":"ContainerStarted","Data":"da96d1e0c11cd81fe7983889b21438c31440eb0d3317a15e6ef0a109151f67ad"}
Dec 04 15:40:12 crc kubenswrapper[4676]: I1204 15:40:12.064498 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:12 crc kubenswrapper[4676]: I1204 15:40:12.064512 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c64fd75cd-msd6p"
Dec 04 15:40:12 crc kubenswrapper[4676]: I1204 15:40:12.109760 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c64fd75cd-msd6p" podStartSLOduration=12.109731709 podStartE2EDuration="12.109731709s" podCreationTimestamp="2025-12-04 15:40:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:12.092714502 +0000 UTC m=+1219.527384359" watchObservedRunningTime="2025-12-04 15:40:12.109731709 +0000 UTC m=+1219.544401566"
Dec 04 15:40:13 crc kubenswrapper[4676]: E1204 15:40:13.612898 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:13 crc kubenswrapper[4676]: E1204 15:40:13.615608 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:13 crc kubenswrapper[4676]: E1204 15:40:13.617051 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"]
Dec 04 15:40:13 crc kubenswrapper[4676]: E1204 15:40:13.617127 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.085662 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6c64fd75cd-msd6p" podUID="baa2202e-331f-46d6-b6a8-e7b6484029f6" containerName="barbican-api-log" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.108618 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-nnr52" event={"ID":"c8534e22-ee3e-4b6c-92a8-1790b69f335d","Type":"ContainerDied","Data":"9b573d1245392b34f4ba6726d28ccd8b4ddbbd5e64c607e235ec0ff8c4d7e6a7"}
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.108666 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b573d1245392b34f4ba6726d28ccd8b4ddbbd5e64c607e235ec0ff8c4d7e6a7"
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.122414 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr52"
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.271083 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq29l\" (UniqueName: \"kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.271751 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.271954 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.272056 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.272085 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.272150 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data\") pod \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\" (UID: \"c8534e22-ee3e-4b6c-92a8-1790b69f335d\") "
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.274030 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.277475 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l" (OuterVolumeSpecName: "kube-api-access-wq29l") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "kube-api-access-wq29l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.282618 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts" (OuterVolumeSpecName: "scripts") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.289049 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.313660 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.360226 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data" (OuterVolumeSpecName: "config-data") pod "c8534e22-ee3e-4b6c-92a8-1790b69f335d" (UID: "c8534e22-ee3e-4b6c-92a8-1790b69f335d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377814 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377866 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq29l\" (UniqueName: \"kubernetes.io/projected/c8534e22-ee3e-4b6c-92a8-1790b69f335d-kube-api-access-wq29l\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377883 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377925 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377939 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8534e22-ee3e-4b6c-92a8-1790b69f335d-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:16 crc kubenswrapper[4676]: I1204 15:40:16.377950 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c8534e22-ee3e-4b6c-92a8-1790b69f335d-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.122201 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-75f9dc548b-ctwhb" event={"ID":"0bea0dc8-b7f4-4623-95a6-813e42180090","Type":"ContainerStarted","Data":"7a682353d1e23df3309015dccac4de462114a71d8fa245961ff91d8c70840d92"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.124619 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75f9dc548b-ctwhb"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.124764 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-75f9dc548b-ctwhb"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.127747 4676 generic.go:334] "Generic (PLEG): container finished" podID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" exitCode=137
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.127947 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"aefbcd15-a508-4c33-9e9a-1e98106e3949","Type":"ContainerDied","Data":"5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.128051 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"aefbcd15-a508-4c33-9e9a-1e98106e3949","Type":"ContainerDied","Data":"e1ffc908d8c9256437066011cf16e9ecc94c767fbda7785e52a77c16d3fe45e5"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.128140 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ffc908d8c9256437066011cf16e9ecc94c767fbda7785e52a77c16d3fe45e5"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.131732 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5bd9cd7f-a3cb-4304-9ce9-73903875b9cd","Type":"ContainerStarted","Data":"a7211a689a1894d67b442254b3e8edda62b6d7ac92a597bf2fd48dc948ff9d85"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.133044 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.142366 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" event={"ID":"40ae722b-54ad-4066-b690-5639be42c4f7","Type":"ContainerStarted","Data":"d283cd29bebe9125919aa14c5070366b65637553c2c29c614f873042dfd3c923"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.148828 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-75f9dc548b-ctwhb" podStartSLOduration=20.148812521 podStartE2EDuration="20.148812521s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:17.146149983 +0000 UTC m=+1224.580819840" watchObservedRunningTime="2025-12-04 15:40:17.148812521 +0000 UTC m=+1224.583482378"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.179307 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d67df5bf5-pk5hl" event={"ID":"8ca5926d-be39-4cda-b11d-bbed877ffa22","Type":"ContainerStarted","Data":"c0551a16453ea52a8922915f099f75a469c0e9d288a3c0a61853a63855caf70d"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.190987 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.191022 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.191288 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" podStartSLOduration=20.19126798 podStartE2EDuration="20.19126798s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:17.190449817 +0000 UTC m=+1224.625119674" watchObservedRunningTime="2025-12-04 15:40:17.19126798 +0000 UTC m=+1224.625937837"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.243290 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-nnr52"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.243446 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerStarted","Data":"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"}
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.284723 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=21.284697699 podStartE2EDuration="21.284697699s" podCreationTimestamp="2025-12-04 15:39:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:17.217434265 +0000 UTC m=+1224.652104122" watchObservedRunningTime="2025-12-04 15:40:17.284697699 +0000 UTC m=+1224.719367556"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.286424 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podStartSLOduration=20.286418069 podStartE2EDuration="20.286418069s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:17.276403827 +0000 UTC m=+1224.711073684" watchObservedRunningTime="2025-12-04 15:40:17.286418069 +0000 UTC m=+1224.721087926"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.298300 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.388789 4676 scope.go:117] "RemoveContainer" containerID="d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.406519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle\") pod \"aefbcd15-a508-4c33-9e9a-1e98106e3949\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") "
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.406556 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjqst\" (UniqueName: \"kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst\") pod \"aefbcd15-a508-4c33-9e9a-1e98106e3949\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") "
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.406640 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs\") pod \"aefbcd15-a508-4c33-9e9a-1e98106e3949\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") "
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.406798 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data\") pod \"aefbcd15-a508-4c33-9e9a-1e98106e3949\" (UID: \"aefbcd15-a508-4c33-9e9a-1e98106e3949\") "
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.410020 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs" (OuterVolumeSpecName: "logs") pod "aefbcd15-a508-4c33-9e9a-1e98106e3949" (UID: "aefbcd15-a508-4c33-9e9a-1e98106e3949"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.442619 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 04 15:40:17 crc kubenswrapper[4676]: E1204 15:40:17.443136 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.443151 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:17 crc kubenswrapper[4676]: E1204 15:40:17.443182 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" containerName="cinder-db-sync"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.443188 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" containerName="cinder-db-sync"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.443410 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" containerName="cinder-db-sync"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.461178 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" containerName="watcher-applier"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.462580 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.471239 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.485533 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.485819 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-jh4hq"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.485986 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.486634 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.493314 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"]
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.498236 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst" (OuterVolumeSpecName: "kube-api-access-tjqst") pod "aefbcd15-a508-4c33-9e9a-1e98106e3949" (UID: "aefbcd15-a508-4c33-9e9a-1e98106e3949"). InnerVolumeSpecName "kube-api-access-tjqst". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.510343 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aefbcd15-a508-4c33-9e9a-1e98106e3949-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.580085 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aefbcd15-a508-4c33-9e9a-1e98106e3949" (UID: "aefbcd15-a508-4c33-9e9a-1e98106e3949"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.588761 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"]
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.590538 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.597977 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"]
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.616763 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.616838 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.616912 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzl8b\" (UniqueName: \"kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.616961 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.616982 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.617039 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.617925 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.617976 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjqst\" (UniqueName: \"kubernetes.io/projected/aefbcd15-a508-4c33-9e9a-1e98106e3949-kube-api-access-tjqst\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.688125 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data" (OuterVolumeSpecName: "config-data") pod "aefbcd15-a508-4c33-9e9a-1e98106e3949" (UID: "aefbcd15-a508-4c33-9e9a-1e98106e3949"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721282 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721364 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721399 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721479 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721536 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721568 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzl8b\" (UniqueName: \"kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721623 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721724 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721757 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mz5\" (UniqueName: \"kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721792 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.721842 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aefbcd15-a508-4c33-9e9a-1e98106e3949-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.724836 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.728999 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.735868 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0"
Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.737406 4676
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.752046 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.769405 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzl8b\" (UniqueName: \"kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b\") pod \"cinder-scheduler-0\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823462 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823525 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mz5\" (UniqueName: \"kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823617 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823644 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.823733 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.828616 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.828650 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.829312 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.832275 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.832730 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.835397 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.847573 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mz5\" (UniqueName: \"kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5\") pod \"dnsmasq-dns-7b9d66887-9f4ws\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") " pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.881228 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.885399 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.890187 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.895823 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.917547 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:17 crc kubenswrapper[4676]: I1204 15:40:17.962975 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:40:17 crc kubenswrapper[4676]: E1204 15:40:17.987257 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030224 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030276 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030400 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030431 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030546 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.030599 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkxjx\" (UniqueName: \"kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.053758 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.054588 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/barbican-api-bbb65f7b4-kp2f2" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136043 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkxjx\" (UniqueName: \"kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136142 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136165 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136236 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136255 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136289 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.136326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.137321 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.139860 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.176730 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkxjx\" (UniqueName: \"kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " 
pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.176878 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.185432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.187326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.190028 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom\") pod \"cinder-api-0\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") " pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.196744 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" podUID="5bd9cd7f-a3cb-4304-9ce9-73903875b9cd" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.212566 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.269631 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"} Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.286754 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d67df5bf5-pk5hl" event={"ID":"8ca5926d-be39-4cda-b11d-bbed877ffa22","Type":"ContainerStarted","Data":"247c43cd675ab5bbbf19e24cf4a5a23664fbcb0d2e519fc4e1e17449d28bb7f1"} Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.305526 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerStarted","Data":"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"} Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.305735 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="ceilometer-notification-agent" containerID="cri-o://79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860" gracePeriod=30 Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.306038 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.306107 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="sg-core" containerID="cri-o://57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f" gracePeriod=30 Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.306087 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="proxy-httpd" containerID="cri-o://0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f" gracePeriod=30 Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.328811 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d67df5bf5-pk5hl" podStartSLOduration=19.524221014 podStartE2EDuration="21.328790296s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="2025-12-04 15:40:07.388858491 +0000 UTC m=+1214.823528348" lastFinishedPulling="2025-12-04 15:40:09.193427773 +0000 UTC m=+1216.628097630" observedRunningTime="2025-12-04 15:40:18.322181284 +0000 UTC m=+1225.756851151" watchObservedRunningTime="2025-12-04 15:40:18.328790296 +0000 UTC m=+1225.763460153" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.337439 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" event={"ID":"cf2c938b-0504-4743-95aa-40338211a37c","Type":"ContainerStarted","Data":"da372db752eec21f5dbddc761465f3c7f5f0a4dc226a8eba7923f3e94d170872"} Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.337528 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.414311 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c64fd75cd-msd6p" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.426989 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c64fd75cd-msd6p" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.653823 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"] Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.663048 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.696698 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.736884 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.738356 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.740657 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.751725 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.911977 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8700a65-1419-4467-8d99-2085481c5890-logs\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.912267 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q97mm\" (UniqueName: \"kubernetes.io/projected/b8700a65-1419-4467-8d99-2085481c5890-kube-api-access-q97mm\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.912289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-config-data\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.912430 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:18 crc kubenswrapper[4676]: I1204 15:40:18.925709 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.015284 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: 
\"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.015358 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8700a65-1419-4467-8d99-2085481c5890-logs\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.015392 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q97mm\" (UniqueName: \"kubernetes.io/projected/b8700a65-1419-4467-8d99-2085481c5890-kube-api-access-q97mm\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.015418 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-config-data\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.016031 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"] Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.016483 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8700a65-1419-4467-8d99-2085481c5890-logs\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.025694 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-config-data\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.036597 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8700a65-1419-4467-8d99-2085481c5890-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.040002 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q97mm\" (UniqueName: \"kubernetes.io/projected/b8700a65-1419-4467-8d99-2085481c5890-kube-api-access-q97mm\") pod \"watcher-applier-0\" (UID: \"b8700a65-1419-4467-8d99-2085481c5890\") " pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.072483 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.211452 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.380708 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerStarted","Data":"396d0c789f4b406e6ccee60956c4a758b573217633b4f579f20733c45c9a5562"} Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.405742 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aefbcd15-a508-4c33-9e9a-1e98106e3949" path="/var/lib/kubelet/pods/aefbcd15-a508-4c33-9e9a-1e98106e3949/volumes" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.414229 4676 generic.go:334] "Generic (PLEG): container finished" podID="6cfbf976-db77-44d0-9a80-83648d806eea" containerID="57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f" exitCode=2 Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.414345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerDied","Data":"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"} Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.448208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerStarted","Data":"1490b9298684c99c84f5a1f31efbb85bbbc4850f1fef4aa3a5954e0a802baa2a"} Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.473240 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" event={"ID":"cf2c938b-0504-4743-95aa-40338211a37c","Type":"ContainerStarted","Data":"8ce82d2d553d4245351d99d99e4f91ab7919d5fb15698df01d0928800a433518"} Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.474128 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.474221 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.474729 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.475367 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="dnsmasq-dns" containerID="cri-o://d283cd29bebe9125919aa14c5070366b65637553c2c29c614f873042dfd3c923" gracePeriod=10 Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.512789 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7d8d8c7d4-6r94k" podStartSLOduration=18.495468653 podStartE2EDuration="22.512764798s" podCreationTimestamp="2025-12-04 15:39:57 +0000 UTC" firstStartedPulling="2025-12-04 15:40:07.678009874 +0000 UTC m=+1215.112679731" lastFinishedPulling="2025-12-04 15:40:11.695306019 +0000 UTC m=+1219.129975876" observedRunningTime="2025-12-04 15:40:19.510418129 +0000 UTC m=+1226.945087996" watchObservedRunningTime="2025-12-04 15:40:19.512764798 +0000 UTC m=+1226.947434655" Dec 04 15:40:19 crc kubenswrapper[4676]: I1204 15:40:19.834323 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-75f9dc548b-ctwhb" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.150892 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.517407 4676 generic.go:334] "Generic (PLEG): container finished" podID="40ae722b-54ad-4066-b690-5639be42c4f7" containerID="d283cd29bebe9125919aa14c5070366b65637553c2c29c614f873042dfd3c923" exitCode=0 Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.517720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" event={"ID":"40ae722b-54ad-4066-b690-5639be42c4f7","Type":"ContainerDied","Data":"d283cd29bebe9125919aa14c5070366b65637553c2c29c614f873042dfd3c923"} Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.517756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" event={"ID":"40ae722b-54ad-4066-b690-5639be42c4f7","Type":"ContainerDied","Data":"d1f52e58125da086b8a975e8c1725a4d0a00e45c0fb65ef8a1228d125c8c5c68"} Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.517769 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f52e58125da086b8a975e8c1725a4d0a00e45c0fb65ef8a1228d125c8c5c68" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.527064 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerStarted","Data":"3373e88d2e9a0b131da0c8b5d27347c7afddacaf3c53de13dfb3a4ac64aa9b0c"} Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.527292 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.531596 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerStarted","Data":"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"} Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.542045 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.542187 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log" containerID="cri-o://f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c" gracePeriod=30 Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.542280 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b8700a65-1419-4467-8d99-2085481c5890","Type":"ContainerStarted","Data":"8c3480e0bce96050d61f9e82d1b99f7a2114580e647ca30930fae75f9837fa67"} Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.543236 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api" containerID="cri-o://c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b" gracePeriod=30 Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.569948 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log" probeResult="failure" output="Get 
\"http://10.217.0.168:9311/healthcheck\": EOF" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.570225 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": EOF" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.666632 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.666717 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.666893 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.667019 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg9cd\" (UniqueName: \"kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.667057 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.667088 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb\") pod \"40ae722b-54ad-4066-b690-5639be42c4f7\" (UID: \"40ae722b-54ad-4066-b690-5639be42c4f7\") " Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.696169 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd" (OuterVolumeSpecName: "kube-api-access-pg9cd") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "kube-api-access-pg9cd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:20 crc kubenswrapper[4676]: I1204 15:40:20.771524 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg9cd\" (UniqueName: \"kubernetes.io/projected/40ae722b-54ad-4066-b690-5639be42c4f7-kube-api-access-pg9cd\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.037801 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.085418 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config" (OuterVolumeSpecName: "config") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.085947 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.098921 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.098951 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.146557 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.160439 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.160639 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.202398 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.202436 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.216757 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40ae722b-54ad-4066-b690-5639be42c4f7" (UID: "40ae722b-54ad-4066-b690-5639be42c4f7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.270516 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.305668 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40ae722b-54ad-4066-b690-5639be42c4f7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.566591 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerStarted","Data":"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"} Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.569289 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d8c669b-28cb-4230-9425-671d7d330d89" containerID="a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4" exitCode=0 Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.569337 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerDied","Data":"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"} Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.572819 4676 generic.go:334] "Generic (PLEG): container finished" podID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerID="f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c" exitCode=143 Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.572873 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerDied","Data":"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"} Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.589693 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6845bf8cdc-5xmc9" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.590560 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"b8700a65-1419-4467-8d99-2085481c5890","Type":"ContainerStarted","Data":"1073d4d2f43e3e5417cd5e483bddbbcba9f4705a73b5e42f91bf01e75324e8e8"} Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.732656 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=3.732631829 podStartE2EDuration="3.732631829s" podCreationTimestamp="2025-12-04 15:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:21.622383169 +0000 UTC m=+1229.057053026" watchObservedRunningTime="2025-12-04 15:40:21.732631829 +0000 UTC m=+1229.167301676" Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.737481 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"] Dec 04 15:40:21 crc kubenswrapper[4676]: I1204 15:40:21.747787 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6845bf8cdc-5xmc9"] Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.611816 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerStarted","Data":"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485"} Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.633082 4676 generic.go:334] "Generic (PLEG): container finished" podID="6cfbf976-db77-44d0-9a80-83648d806eea" containerID="79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860" exitCode=0 Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.633145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerDied","Data":"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"} Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.643110 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerStarted","Data":"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"} Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.643414 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.681270 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" podStartSLOduration=5.6812481980000005 podStartE2EDuration="5.681248198s" podCreationTimestamp="2025-12-04 15:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:22.672033529 +0000 UTC m=+1230.106703376" watchObservedRunningTime="2025-12-04 15:40:22.681248198 +0000 UTC m=+1230.115918065" Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.901491 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:40:22 crc kubenswrapper[4676]: I1204 15:40:22.939492 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 04 
15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.400335 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" path="/var/lib/kubelet/pods/40ae722b-54ad-4066-b690-5639be42c4f7/volumes"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.652786 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pksjc" event={"ID":"3ac7518d-e354-42a9-85e4-766e455bf838","Type":"ContainerStarted","Data":"4c5f5c531c8768d6c4f1b6ff429a5e703561b00edafe069c4fb0c705f96d59cc"}
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.655254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerStarted","Data":"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc"}
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.658437 4676 generic.go:334] "Generic (PLEG): container finished" podID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710" exitCode=1
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.658779 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"}
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.659120 4676 scope.go:117] "RemoveContainer" containerID="d3dce1564d44980c735df7f44391fc16dd797b13c0b45bfcf54a92cd9508f17d"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.660627 4676 scope.go:117] "RemoveContainer" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"
Dec 04 15:40:23 crc kubenswrapper[4676]: E1204 15:40:23.660961 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.668815 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api-log" containerID="cri-o://9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737" gracePeriod=30
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.669082 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerStarted","Data":"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"}
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.669121 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.669180 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api" containerID="cri-o://caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7" gracePeriod=30
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.712243 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:35966->10.217.0.168:9311: read: connection reset by peer"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.712297 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:35980->10.217.0.168:9311: read: connection reset by peer"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.713398 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bbb65f7b4-kp2f2" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": dial tcp 10.217.0.168:9311: connect: connection refused"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.732006 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pksjc" podStartSLOduration=28.247633365 podStartE2EDuration="1m7.731981139s" podCreationTimestamp="2025-12-04 15:39:16 +0000 UTC" firstStartedPulling="2025-12-04 15:39:42.976696635 +0000 UTC m=+1190.411366492" lastFinishedPulling="2025-12-04 15:40:22.461044409 +0000 UTC m=+1229.895714266" observedRunningTime="2025-12-04 15:40:23.68337596 +0000 UTC m=+1231.118045837" watchObservedRunningTime="2025-12-04 15:40:23.731981139 +0000 UTC m=+1231.166651016"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.747487 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.8210439 podStartE2EDuration="6.747467092s" podCreationTimestamp="2025-12-04 15:40:17 +0000 UTC" firstStartedPulling="2025-12-04 15:40:18.925652384 +0000 UTC m=+1226.360322241" lastFinishedPulling="2025-12-04 15:40:19.852075576 +0000 UTC m=+1227.286745433" observedRunningTime="2025-12-04 15:40:23.743206857 +0000 UTC m=+1231.177876714" watchObservedRunningTime="2025-12-04 15:40:23.747467092 +0000 UTC m=+1231.182136949"
Dec 04 15:40:23 crc kubenswrapper[4676]: I1204 15:40:23.749997 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.749990075 podStartE2EDuration="6.749990075s" podCreationTimestamp="2025-12-04 15:40:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:23.727139788 +0000 UTC m=+1231.161809635" watchObservedRunningTime="2025-12-04 15:40:23.749990075 +0000 UTC m=+1231.184659932"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.074075 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.182130 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.292828 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom\") pod \"eea76c68-cc6a-4647-af40-c0ebf21b1226\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.293225 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data\") pod \"eea76c68-cc6a-4647-af40-c0ebf21b1226\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.293333 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-266bq\" (UniqueName: \"kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq\") pod \"eea76c68-cc6a-4647-af40-c0ebf21b1226\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.293483 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle\") pod \"eea76c68-cc6a-4647-af40-c0ebf21b1226\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.293553 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs\") pod \"eea76c68-cc6a-4647-af40-c0ebf21b1226\" (UID: \"eea76c68-cc6a-4647-af40-c0ebf21b1226\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.295083 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs" (OuterVolumeSpecName: "logs") pod "eea76c68-cc6a-4647-af40-c0ebf21b1226" (UID: "eea76c68-cc6a-4647-af40-c0ebf21b1226"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.300165 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq" (OuterVolumeSpecName: "kube-api-access-266bq") pod "eea76c68-cc6a-4647-af40-c0ebf21b1226" (UID: "eea76c68-cc6a-4647-af40-c0ebf21b1226"). InnerVolumeSpecName "kube-api-access-266bq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.304441 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eea76c68-cc6a-4647-af40-c0ebf21b1226" (UID: "eea76c68-cc6a-4647-af40-c0ebf21b1226"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.324711 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eea76c68-cc6a-4647-af40-c0ebf21b1226" (UID: "eea76c68-cc6a-4647-af40-c0ebf21b1226"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.372733 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data" (OuterVolumeSpecName: "config-data") pod "eea76c68-cc6a-4647-af40-c0ebf21b1226" (UID: "eea76c68-cc6a-4647-af40-c0ebf21b1226"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.378682 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.399517 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.399554 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea76c68-cc6a-4647-af40-c0ebf21b1226-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.399564 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.399572 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea76c68-cc6a-4647-af40-c0ebf21b1226-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.399581 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-266bq\" (UniqueName: \"kubernetes.io/projected/eea76c68-cc6a-4647-af40-c0ebf21b1226-kube-api-access-266bq\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500363 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500475 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500598 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500666 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500775 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkxjx\" (UniqueName: \"kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500830 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.500929 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs\") pod \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\" (UID: \"6a2aa236-e94e-423d-b8ab-debb9206b6ae\") "
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.501372 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.501573 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6a2aa236-e94e-423d-b8ab-debb9206b6ae-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.502072 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs" (OuterVolumeSpecName: "logs") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.505014 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts" (OuterVolumeSpecName: "scripts") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.506254 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx" (OuterVolumeSpecName: "kube-api-access-dkxjx") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "kube-api-access-dkxjx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.506613 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.536826 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.570477 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data" (OuterVolumeSpecName: "config-data") pod "6a2aa236-e94e-423d-b8ab-debb9206b6ae" (UID: "6a2aa236-e94e-423d-b8ab-debb9206b6ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606467 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606539 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606553 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606599 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkxjx\" (UniqueName: \"kubernetes.io/projected/6a2aa236-e94e-423d-b8ab-debb9206b6ae-kube-api-access-dkxjx\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606616 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a2aa236-e94e-423d-b8ab-debb9206b6ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.606629 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a2aa236-e94e-423d-b8ab-debb9206b6ae-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.700933 4676 generic.go:334] "Generic (PLEG): container finished" podID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerID="c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b" exitCode=0
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.701184 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bbb65f7b4-kp2f2"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.701174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerDied","Data":"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"}
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.701405 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bbb65f7b4-kp2f2" event={"ID":"eea76c68-cc6a-4647-af40-c0ebf21b1226","Type":"ContainerDied","Data":"db8f43b4e80885e576533d60eef5e5c0270919937af6e618c67eeed395ad0e37"}
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.701425 4676 scope.go:117] "RemoveContainer" containerID="c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.710139 4676 scope.go:117] "RemoveContainer" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.710741 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718453 4676 generic.go:334] "Generic (PLEG): container finished" podID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerID="caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7" exitCode=0
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718488 4676 generic.go:334] "Generic (PLEG): container finished" podID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerID="9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737" exitCode=143
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718719 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerDied","Data":"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"}
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718789 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerDied","Data":"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"}
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718807 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"6a2aa236-e94e-423d-b8ab-debb9206b6ae","Type":"ContainerDied","Data":"3373e88d2e9a0b131da0c8b5d27347c7afddacaf3c53de13dfb3a4ac64aa9b0c"}
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.718882 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.772337 4676 scope.go:117] "RemoveContainer" containerID="f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.782712 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.805855 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.815401 4676 scope.go:117] "RemoveContainer" containerID="c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.815953 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b\": container with ID starting with c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b not found: ID does not exist" containerID="c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.816009 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b"} err="failed to get container status \"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b\": rpc error: code = NotFound desc = could not find container \"c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b\": container with ID starting with c625a17786362446edbfb8b2b109b3a243e2abdf550ea2869d99439a7e99eb7b not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.816040 4676 scope.go:117] "RemoveContainer" containerID="f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.816383 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c\": container with ID starting with f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c not found: ID does not exist" containerID="f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.816417 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c"} err="failed to get container status \"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c\": rpc error: code = NotFound desc = could not find container \"f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c\": container with ID starting with f1cf864707337f8914d1dbb7af3cc866f416f57bb9763e21ad85714b42bfb55c not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.816445 4676 scope.go:117] "RemoveContainer" containerID="caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.829929 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"]
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.841057 4676 scope.go:117] "RemoveContainer" containerID="9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.849012 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-bbb65f7b4-kp2f2"]
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.860810 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861240 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861256 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861274 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="init"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861280 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="init"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861290 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861297 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861315 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861322 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861333 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861339 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.861352 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="dnsmasq-dns"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861357 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="dnsmasq-dns"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861536 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861548 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ae722b-54ad-4066-b690-5639be42c4f7" containerName="dnsmasq-dns"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861565 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" containerName="cinder-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861578 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.861586 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" containerName="barbican-api-log"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.862732 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.868164 4676 scope.go:117] "RemoveContainer" containerID="caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.868584 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.868604 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.868879 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.870268 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7\": container with ID starting with caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7 not found: ID does not exist" containerID="caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.870313 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"} err="failed to get container status \"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7\": rpc error: code = NotFound desc = could not find container \"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7\": container with ID starting with caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7 not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.870346 4676 scope.go:117] "RemoveContainer" containerID="9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"
Dec 04 15:40:24 crc kubenswrapper[4676]: E1204 15:40:24.870708 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737\": container with ID starting with 9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737 not found: ID does not exist" containerID="9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.870810 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"} err="failed to get container status \"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737\": rpc error: code = NotFound desc = could not find container \"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737\": container with ID starting with 9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737 not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.870853 4676 scope.go:117] "RemoveContainer" containerID="caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.871261 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7"} err="failed to get container status \"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7\": rpc error: code = NotFound desc = could not find container \"caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7\": container with ID starting with caf219c95d73aff33f2cb0d79c72ae15b0704fdfafb5ed2945f7e225b75f26c7 not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.871302 4676 scope.go:117] "RemoveContainer" containerID="9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.871661 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737"} err="failed to get container status \"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737\": rpc error: code = NotFound desc = could not find container \"9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737\": container with ID starting with 9116ef2efc3ed786529e5bd446e4d6e708e04f4188a00358c229c1fa81ca4737 not found: ID does not exist"
Dec 04 15:40:24 crc kubenswrapper[4676]: I1204 15:40:24.892170 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026091 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026188 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-scripts\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026231 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026456 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026585 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026646 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ccb2a9-3d12-4899-bae2-618d80e5167c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026696 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dss59\" (UniqueName: \"kubernetes.io/projected/d9ccb2a9-3d12-4899-bae2-618d80e5167c-kube-api-access-dss59\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.026792 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ccb2a9-3d12-4899-bae2-618d80e5167c-logs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128154 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dss59\" (UniqueName: \"kubernetes.io/projected/d9ccb2a9-3d12-4899-bae2-618d80e5167c-kube-api-access-dss59\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128237 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ccb2a9-3d12-4899-bae2-618d80e5167c-logs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128263 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128308 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-scripts\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128326 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128350 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128410 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128432 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128460 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ccb2a9-3d12-4899-bae2-618d80e5167c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128536 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d9ccb2a9-3d12-4899-bae2-618d80e5167c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.128732 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9ccb2a9-3d12-4899-bae2-618d80e5167c-logs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.133282 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.133828 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.134416 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.135559 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-scripts\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.136104 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.139661 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9ccb2a9-3d12-4899-bae2-618d80e5167c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.147849 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dss59\" (UniqueName: \"kubernetes.io/projected/d9ccb2a9-3d12-4899-bae2-618d80e5167c-kube-api-access-dss59\") pod \"cinder-api-0\" (UID: \"d9ccb2a9-3d12-4899-bae2-618d80e5167c\") " pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.192488 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.399630 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a2aa236-e94e-423d-b8ab-debb9206b6ae" path="/var/lib/kubelet/pods/6a2aa236-e94e-423d-b8ab-debb9206b6ae/volumes"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.400843 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eea76c68-cc6a-4647-af40-c0ebf21b1226" path="/var/lib/kubelet/pods/eea76c68-cc6a-4647-af40-c0ebf21b1226/volumes"
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.635256 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 04 15:40:25 crc kubenswrapper[4676]: W1204 15:40:25.641181 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9ccb2a9_3d12_4899_bae2_618d80e5167c.slice/crio-b15f76841467c9b4eb3cf70398f8fabb8d625ac47af7cd028c1d9d0feb82475c WatchSource:0}: Error finding container b15f76841467c9b4eb3cf70398f8fabb8d625ac47af7cd028c1d9d0feb82475c: Status 404 returned error can't find the container with id b15f76841467c9b4eb3cf70398f8fabb8d625ac47af7cd028c1d9d0feb82475c
Dec 04 15:40:25 crc kubenswrapper[4676]: I1204 15:40:25.731349 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9ccb2a9-3d12-4899-bae2-618d80e5167c","Type":"ContainerStarted","Data":"b15f76841467c9b4eb3cf70398f8fabb8d625ac47af7cd028c1d9d0feb82475c"}
Dec 04 15:40:26 crc kubenswrapper[4676]: I1204 15:40:26.746109 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9ccb2a9-3d12-4899-bae2-618d80e5167c","Type":"ContainerStarted","Data":"7a4342384d9422ff62ee0aaede11fd33ce81907c88bf6091f9bd157c86eca465"}
Dec 04 15:40:26 crc kubenswrapper[4676]: I1204 15:40:26.746628 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 04 15:40:26 crc kubenswrapper[4676]: I1204 15:40:26.746639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d9ccb2a9-3d12-4899-bae2-618d80e5167c","Type":"ContainerStarted","Data":"aff0809a370336ae4f6f4137a7010dd0c3af9dba2be4f3a4d9f7794a35690b5c"}
Dec 04 15:40:26 crc kubenswrapper[4676]: I1204 15:40:26.780098 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.780070343 podStartE2EDuration="2.780070343s" podCreationTimestamp="2025-12-04 15:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:26.771566505 +0000 UTC m=+1234.206236362" watchObservedRunningTime="2025-12-04 15:40:26.780070343 +0000 UTC m=+1234.214740200"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.201208 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.207750 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.461429 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-75f9dc548b-ctwhb"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.836282 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.919129 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.999171 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"]
Dec 04 15:40:27 crc kubenswrapper[4676]: I1204 15:40:27.999585 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="dnsmasq-dns" containerID="cri-o://dc83a9536d95eb7a61acfb9801cd70f396a2071a9fa4018f6c2f5123c8cbb9d0" gracePeriod=10
Dec 04 15:40:28 crc kubenswrapper[4676]: I1204 15:40:28.088219 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 04 15:40:28 crc kubenswrapper[4676]: I1204 15:40:28.795868 4676 generic.go:334] "Generic (PLEG): container finished" podID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerID="dc83a9536d95eb7a61acfb9801cd70f396a2071a9fa4018f6c2f5123c8cbb9d0" exitCode=0
Dec 04 15:40:28 crc kubenswrapper[4676]: I1204 15:40:28.797369 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" event={"ID":"3ed7fb0d-bb13-44f2-9e12-fe5829c660af","Type":"ContainerDied","Data":"dc83a9536d95eb7a61acfb9801cd70f396a2071a9fa4018f6c2f5123c8cbb9d0"}
Dec 04 15:40:28 crc kubenswrapper[4676]: I1204 15:40:28.859218 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 04 15:40:28 crc kubenswrapper[4676]: I1204 15:40:28.899861 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019360 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019504 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019594 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019671 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.019705 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvk7z\" (UniqueName: \"kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z\") pod \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\" (UID: \"3ed7fb0d-bb13-44f2-9e12-fe5829c660af\") "
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.026317 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z" (OuterVolumeSpecName: "kube-api-access-rvk7z") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "kube-api-access-rvk7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.073723 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.076271 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.078798 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.084018 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config" (OuterVolumeSpecName: "config") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.093970 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.101134 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3ed7fb0d-bb13-44f2-9e12-fe5829c660af" (UID: "3ed7fb0d-bb13-44f2-9e12-fe5829c660af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.115072 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.121895 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.122031 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.122044 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-config\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.122056 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvk7z\" (UniqueName: \"kubernetes.io/projected/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-kube-api-access-rvk7z\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.122070 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.122080 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3ed7fb0d-bb13-44f2-9e12-fe5829c660af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.807701 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.807945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9df8fb6c-mjt7v" event={"ID":"3ed7fb0d-bb13-44f2-9e12-fe5829c660af","Type":"ContainerDied","Data":"31c6ceb3936a713da96f40d993d079b8ecec8545890b51a5cb2f5a4156771764"}
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.808019 4676 scope.go:117] "RemoveContainer" containerID="dc83a9536d95eb7a61acfb9801cd70f396a2071a9fa4018f6c2f5123c8cbb9d0"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.809592 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="cinder-scheduler" containerID="cri-o://55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485" gracePeriod=30
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.809710 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="probe" containerID="cri-o://6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc" gracePeriod=30
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.843257 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"]
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.850049 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0"
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.851416 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b9df8fb6c-mjt7v"]
Dec 04 15:40:29 crc kubenswrapper[4676]: I1204 15:40:29.855294 4676 scope.go:117] "RemoveContainer" containerID="75822f86c899f749943237accfe41642dd9812954e385868cb951c2a685ef826"
Dec 04 15:40:30 crc kubenswrapper[4676]: I1204 15:40:30.235335 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-97885899c-28t7l"
Dec 04 15:40:30 crc kubenswrapper[4676]: I1204 15:40:30.821002 4676 generic.go:334] "Generic (PLEG): container finished" podID="468634de-1454-48d2-9a70-d9f9ac450550" containerID="6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc" exitCode=0
Dec 04 15:40:30 crc kubenswrapper[4676]: I1204 15:40:30.821051 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerDied","Data":"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc"}
Dec 04 15:40:31 crc kubenswrapper[4676]: I1204 15:40:31.270973 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-78c887c44-wcq82" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.156:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.156:8443: connect: connection refused"
Dec 04 15:40:31 crc kubenswrapper[4676]: I1204 15:40:31.420243 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" path="/var/lib/kubelet/pods/3ed7fb0d-bb13-44f2-9e12-fe5829c660af/volumes"
Dec 04 15:40:31 crc kubenswrapper[4676]: I1204 15:40:31.834073 4676 generic.go:334] "Generic (PLEG): container finished" podID="1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" containerID="3824cedf3821404ecaa93361a03f6ca90e326fcb663133d0b9765ae49aef9e60" exitCode=0
Dec 04 15:40:31 crc kubenswrapper[4676]: I1204 15:40:31.834149 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mxcxz" event={"ID":"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997","Type":"ContainerDied","Data":"3824cedf3821404ecaa93361a03f6ca90e326fcb663133d0b9765ae49aef9e60"}
Dec 04 15:40:31 crc kubenswrapper[4676]: I1204 15:40:31.880011 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.844740 4676 generic.go:334] "Generic (PLEG): container finished" podID="3ac7518d-e354-42a9-85e4-766e455bf838" containerID="4c5f5c531c8768d6c4f1b6ff429a5e703561b00edafe069c4fb0c705f96d59cc" exitCode=0
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.844842 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pksjc" event={"ID":"3ac7518d-e354-42a9-85e4-766e455bf838","Type":"ContainerDied","Data":"4c5f5c531c8768d6c4f1b6ff429a5e703561b00edafe069c4fb0c705f96d59cc"}
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.900642 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.900711 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.900724 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.901458 4676 scope.go:117] "RemoveContainer" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"
Dec 04 15:40:32 crc kubenswrapper[4676]: E1204 15:40:32.901730 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.937083 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Dec 04 15:40:32 crc kubenswrapper[4676]: E1204 15:40:32.937492 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="init"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.937511 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="init"
Dec 04 15:40:32 crc kubenswrapper[4676]: E1204 15:40:32.937533 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="dnsmasq-dns"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.937542 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="dnsmasq-dns"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.937753 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed7fb0d-bb13-44f2-9e12-fe5829c660af" containerName="dnsmasq-dns"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.938422 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.940707 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7zfnq"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.941682 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.942082 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Dec 04 15:40:32 crc kubenswrapper[4676]: I1204 15:40:32.962683 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.094110 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmxqc\" (UniqueName: \"kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.094592 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.094624 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.095190 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.160615 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.161973 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-xmxqc openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="0003ead5-ca98-4a2d-a84f-8dd9815a07db"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.191454 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.200033 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.200086 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmxqc\" (UniqueName: \"kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.200117 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.200141 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.201977 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.206384 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.207208 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient"
Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.216080 4676 projected.go:194] Error preparing data for projected volume kube-api-access-xmxqc for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.216171 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc podName:0003ead5-ca98-4a2d-a84f-8dd9815a07db nodeName:}" failed. No retries permitted until 2025-12-04 15:40:33.716141498 +0000 UTC m=+1241.150811355 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-xmxqc" (UniqueName: "kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc") pod "openstackclient" (UID: "0003ead5-ca98-4a2d-a84f-8dd9815a07db") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.256054 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.257654 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.286318 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.346389 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.405148 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.405443 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.405615 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fl6z\" (UniqueName: \"kubernetes.io/projected/da921c96-bdd0-4aa2-a98e-9adc22788b75-kube-api-access-4fl6z\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.405666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config-secret\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.506477 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config\") pod \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.506567 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle\") pod \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.506732 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gmxsp\" (UniqueName: \"kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp\") pod \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\" (UID: \"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.507185 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fl6z\" (UniqueName: \"kubernetes.io/projected/da921c96-bdd0-4aa2-a98e-9adc22788b75-kube-api-access-4fl6z\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.507239 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config-secret\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.507322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.507366 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.508368 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.512063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp" (OuterVolumeSpecName: "kube-api-access-gmxsp") pod "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" (UID: "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997"). InnerVolumeSpecName "kube-api-access-gmxsp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.522347 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.522776 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da921c96-bdd0-4aa2-a98e-9adc22788b75-openstack-config-secret\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.528452 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fl6z\" (UniqueName: \"kubernetes.io/projected/da921c96-bdd0-4aa2-a98e-9adc22788b75-kube-api-access-4fl6z\") pod \"openstackclient\" (UID: \"da921c96-bdd0-4aa2-a98e-9adc22788b75\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.552216 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" (UID: "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.563777 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config" (OuterVolumeSpecName: "config") pod "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" (UID: "1eaff04d-0c2d-4de6-ae7d-e0da6a64f997"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.609381 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.609427 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.609441 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmxsp\" (UniqueName: \"kubernetes.io/projected/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997-kube-api-access-gmxsp\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.626053 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.662621 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.710825 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.710930 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.710980 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.710989 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.711099 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.711231 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzl8b\" (UniqueName: \"kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.711316 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle\") pod \"468634de-1454-48d2-9a70-d9f9ac450550\" (UID: \"468634de-1454-48d2-9a70-d9f9ac450550\") " Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.711815 4676 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/468634de-1454-48d2-9a70-d9f9ac450550-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.716064 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts" (OuterVolumeSpecName: "scripts") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.716133 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.718114 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b" (OuterVolumeSpecName: "kube-api-access-pzl8b") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "kube-api-access-pzl8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.810460 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.825339 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmxqc\" (UniqueName: \"kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc\") pod \"openstackclient\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.825646 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzl8b\" (UniqueName: \"kubernetes.io/projected/468634de-1454-48d2-9a70-d9f9ac450550-kube-api-access-pzl8b\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.825658 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.825670 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.825680 4676 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.827769 4676 projected.go:194] Error preparing data for projected volume kube-api-access-xmxqc for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (0003ead5-ca98-4a2d-a84f-8dd9815a07db) does not match the UID in record. 
The object might have been deleted and then recreated Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.827834 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc podName:0003ead5-ca98-4a2d-a84f-8dd9815a07db nodeName:}" failed. No retries permitted until 2025-12-04 15:40:34.827816458 +0000 UTC m=+1242.262486315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xmxqc" (UniqueName: "kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc") pod "openstackclient" (UID: "0003ead5-ca98-4a2d-a84f-8dd9815a07db") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (0003ead5-ca98-4a2d-a84f-8dd9815a07db) does not match the UID in record. The object might have been deleted and then recreated Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.877703 4676 generic.go:334] "Generic (PLEG): container finished" podID="468634de-1454-48d2-9a70-d9f9ac450550" containerID="55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485" exitCode=0 Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.877829 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerDied","Data":"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485"} Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.877857 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"468634de-1454-48d2-9a70-d9f9ac450550","Type":"ContainerDied","Data":"396d0c789f4b406e6ccee60956c4a758b573217633b4f579f20733c45c9a5562"} Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.877889 4676 scope.go:117] "RemoveContainer" containerID="6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.878103 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.881349 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data" (OuterVolumeSpecName: "config-data") pod "468634de-1454-48d2-9a70-d9f9ac450550" (UID: "468634de-1454-48d2-9a70-d9f9ac450550"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.885272 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.885952 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mxcxz" event={"ID":"1eaff04d-0c2d-4de6-ae7d-e0da6a64f997","Type":"ContainerDied","Data":"be48d26610ec7c0bb1baf5c0ea2e5a3e66bdd9c2f0800eb22dc7143aa4fa1bbb"} Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.886001 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be48d26610ec7c0bb1baf5c0ea2e5a3e66bdd9c2f0800eb22dc7143aa4fa1bbb" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.886067 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mxcxz" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.902974 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.906893 4676 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0003ead5-ca98-4a2d-a84f-8dd9815a07db" podUID="da921c96-bdd0-4aa2-a98e-9adc22788b75" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.920000 4676 scope.go:117] "RemoveContainer" containerID="55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.929690 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/468634de-1454-48d2-9a70-d9f9ac450550-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.951727 4676 scope.go:117] "RemoveContainer" containerID="6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc" Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.954679 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc\": container with ID starting with 6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc not found: ID does not exist" containerID="6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.954719 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc"} err="failed to get container status \"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc\": rpc error: code = NotFound desc = could not find container \"6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc\": container with ID starting with 6e47f194e4531ea80013f6b76a160335727d5b7ba8f825cc0ab211fa97682cbc not found: ID does not exist" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.954743 4676 scope.go:117] "RemoveContainer" containerID="55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485" Dec 04 15:40:33 crc kubenswrapper[4676]: E1204 15:40:33.957890 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485\": container with ID starting with 55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485 not found: ID does not exist" containerID="55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485" Dec 04 15:40:33 crc kubenswrapper[4676]: I1204 15:40:33.957942 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485"} err="failed to get container status \"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485\": rpc error: code = NotFound desc = could not find container \"55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485\": container with ID starting with 55815960957deb7ee6633d63a4c5a949b85c465fb8c401b6ef7d77f649830485 not found: ID does not exist" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.034386 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle\") pod \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.034531 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config\") pod \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.034584 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret\") pod \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\" (UID: \"0003ead5-ca98-4a2d-a84f-8dd9815a07db\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.035406 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmxqc\" (UniqueName: \"kubernetes.io/projected/0003ead5-ca98-4a2d-a84f-8dd9815a07db-kube-api-access-xmxqc\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.036530 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0003ead5-ca98-4a2d-a84f-8dd9815a07db" (UID: "0003ead5-ca98-4a2d-a84f-8dd9815a07db"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.040975 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:34 crc kubenswrapper[4676]: E1204 15:40:34.044610 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="cinder-scheduler" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.044666 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0003ead5-ca98-4a2d-a84f-8dd9815a07db" (UID: "0003ead5-ca98-4a2d-a84f-8dd9815a07db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045200 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0003ead5-ca98-4a2d-a84f-8dd9815a07db" (UID: "0003ead5-ca98-4a2d-a84f-8dd9815a07db"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045233 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="cinder-scheduler" Dec 04 15:40:34 crc kubenswrapper[4676]: E1204 15:40:34.045304 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="probe" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045319 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="probe" Dec 04 15:40:34 crc kubenswrapper[4676]: E1204 15:40:34.045378 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" containerName="neutron-db-sync" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045387 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" containerName="neutron-db-sync" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045892 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="probe" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045941 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" containerName="neutron-db-sync" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.045953 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="468634de-1454-48d2-9a70-d9f9ac450550" containerName="cinder-scheduler" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.047723 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.078141 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139136 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139237 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139350 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139388 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw26m\" (UniqueName: \"kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " 
pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139478 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139528 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139625 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139647 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.139657 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0003ead5-ca98-4a2d-a84f-8dd9815a07db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.190384 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56cc94d674-46bbd"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.194971 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.207596 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sjgpz" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.207787 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.207986 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.208183 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.212348 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.213772 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56cc94d674-46bbd"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.236067 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242317 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242371 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw26m\" (UniqueName: \"kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242492 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242552 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.242593 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc 
kubenswrapper[4676]: I1204 15:40:34.243302 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.243302 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.243694 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.243850 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.248348 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.296717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw26m\" (UniqueName: \"kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m\") pod \"dnsmasq-dns-77c5c8855-gnwsl\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.336947 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.345181 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.345253 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.345284 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2w8l\" (UniqueName: \"kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l\") pod \"neutron-56cc94d674-46bbd\" (UID: 
\"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.345303 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.345329 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.378457 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.387778 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.416017 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.417748 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.420357 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.446844 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.447112 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.447162 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.447191 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2w8l\" (UniqueName: \"kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.447211 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") 
" pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.457396 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.457983 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.458007 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.483504 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.494693 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.495444 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2w8l\" (UniqueName: \"kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l\") pod \"neutron-56cc94d674-46bbd\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") " pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.555836 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.555888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.555960 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.556007 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.557215 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htw4n\" (UniqueName: \"kubernetes.io/projected/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-kube-api-access-htw4n\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.557268 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.601423 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pksjc" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.628980 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.659642 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c86c\" (UniqueName: \"kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c\") pod \"3ac7518d-e354-42a9-85e4-766e455bf838\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.659723 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle\") pod \"3ac7518d-e354-42a9-85e4-766e455bf838\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.659944 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data\") pod \"3ac7518d-e354-42a9-85e4-766e455bf838\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.659980 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data\") pod \"3ac7518d-e354-42a9-85e4-766e455bf838\" (UID: \"3ac7518d-e354-42a9-85e4-766e455bf838\") " Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660233 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660272 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htw4n\" (UniqueName: \"kubernetes.io/projected/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-kube-api-access-htw4n\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660381 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660407 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660455 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.660642 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.665957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.667385 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c" (OuterVolumeSpecName: "kube-api-access-8c86c") pod "3ac7518d-e354-42a9-85e4-766e455bf838" (UID: "3ac7518d-e354-42a9-85e4-766e455bf838"). InnerVolumeSpecName "kube-api-access-8c86c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.670105 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.670207 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-config-data\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.674355 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-scripts\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.675103 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3ac7518d-e354-42a9-85e4-766e455bf838" (UID: "3ac7518d-e354-42a9-85e4-766e455bf838"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.686741 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htw4n\" (UniqueName: \"kubernetes.io/projected/68ff764e-4045-42f0-83c6-b0ab7a4f3d7d-kube-api-access-htw4n\") pod \"cinder-scheduler-0\" (UID: \"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d\") " pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.725786 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ac7518d-e354-42a9-85e4-766e455bf838" (UID: "3ac7518d-e354-42a9-85e4-766e455bf838"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.749885 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.760033 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data" (OuterVolumeSpecName: "config-data") pod "3ac7518d-e354-42a9-85e4-766e455bf838" (UID: "3ac7518d-e354-42a9-85e4-766e455bf838"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.762424 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.762463 4676 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.762482 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c86c\" (UniqueName: \"kubernetes.io/projected/3ac7518d-e354-42a9-85e4-766e455bf838-kube-api-access-8c86c\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.762499 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ac7518d-e354-42a9-85e4-766e455bf838-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.926232 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pksjc" event={"ID":"3ac7518d-e354-42a9-85e4-766e455bf838","Type":"ContainerDied","Data":"09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b"} Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.926523 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e6adb64cd5040941c8ddb57141b046f50d8dda7cf5e42fb420616d9a8cc64b" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.926584 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pksjc" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.970492 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 04 15:40:34 crc kubenswrapper[4676]: I1204 15:40:34.970679 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da921c96-bdd0-4aa2-a98e-9adc22788b75","Type":"ContainerStarted","Data":"5925fe372af19ed3d1840b1c009ee90afa62f9ae425152e42a808e357d279d6d"} Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.020939 4676 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="0003ead5-ca98-4a2d-a84f-8dd9815a07db" podUID="da921c96-bdd0-4aa2-a98e-9adc22788b75" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.037437 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.274988 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.333863 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:40:35 crc kubenswrapper[4676]: E1204 15:40:35.334380 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac7518d-e354-42a9-85e4-766e455bf838" containerName="glance-db-sync" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.334404 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac7518d-e354-42a9-85e4-766e455bf838" containerName="glance-db-sync" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.334599 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac7518d-e354-42a9-85e4-766e455bf838" containerName="glance-db-sync" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.335661 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.373534 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382005 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4f4\" (UniqueName: \"kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382111 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382176 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382206 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382285 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.382316 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.447607 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0003ead5-ca98-4a2d-a84f-8dd9815a07db" path="/var/lib/kubelet/pods/0003ead5-ca98-4a2d-a84f-8dd9815a07db/volumes" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.448729 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="468634de-1454-48d2-9a70-d9f9ac450550" path="/var/lib/kubelet/pods/468634de-1454-48d2-9a70-d9f9ac450550/volumes" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.484773 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 
04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.484827 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.484889 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.484930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.484958 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4f4\" (UniqueName: \"kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.485022 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.486578 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.487493 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.487636 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.487657 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.491534 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.491992 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56cc94d674-46bbd"] Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.577608 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4f4\" (UniqueName: \"kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4\") pod \"dnsmasq-dns-866f9499b7-bl2lr\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.682237 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.868137 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:35 crc kubenswrapper[4676]: I1204 15:40:35.878537 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001699 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001800 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001835 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001866 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001895 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.001988 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.002076 4676 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xjv2\" (UniqueName: \"kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2\") pod \"f68f12a3-a61b-492b-94e9-4351419cfa7b\" (UID: \"f68f12a3-a61b-492b-94e9-4351419cfa7b\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.002831 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs" (OuterVolumeSpecName: "logs") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.006636 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f68f12a3-a61b-492b-94e9-4351419cfa7b-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.014386 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2" (OuterVolumeSpecName: "kube-api-access-8xjv2") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "kube-api-access-8xjv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.021765 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d","Type":"ContainerStarted","Data":"e94b987a0c31efe9758c02dce84efbb02a5a775ab92550a2d562576fc6387f1e"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.029267 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.032397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerStarted","Data":"f33f0e5c98947a33319dc3b7b3de1e8f1dda4691de801ba67da15b581199db76"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.045170 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data" (OuterVolumeSpecName: "config-data") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.049874 4676 generic.go:334] "Generic (PLEG): container finished" podID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerID="b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa" exitCode=137 Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.049970 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerDied","Data":"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.050008 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78c887c44-wcq82" event={"ID":"f68f12a3-a61b-492b-94e9-4351419cfa7b","Type":"ContainerDied","Data":"3313912cb7088c955042933763f209091d3fffc4985c84e3a203790365256d22"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.050030 4676 scope.go:117] "RemoveContainer" containerID="061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.050159 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78c887c44-wcq82" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.052467 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts" (OuterVolumeSpecName: "scripts") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.073513 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" event={"ID":"071cd019-bbb2-4632-a889-73e6f556d45e","Type":"ContainerStarted","Data":"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.073770 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" event={"ID":"071cd019-bbb2-4632-a889-73e6f556d45e","Type":"ContainerStarted","Data":"e4a15faf217ff722100a3e6f9287198482a404e5ffb2f6c6edfb01db7d67ffec"} Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.086836 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.108879 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.108952 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.108966 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f68f12a3-a61b-492b-94e9-4351419cfa7b-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.108978 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.108992 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xjv2\" (UniqueName: \"kubernetes.io/projected/f68f12a3-a61b-492b-94e9-4351419cfa7b-kube-api-access-8xjv2\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.168158 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:36 crc kubenswrapper[4676]: E1204 15:40:36.173401 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.173436 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" Dec 04 15:40:36 crc kubenswrapper[4676]: E1204 15:40:36.173472 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon-log" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.173479 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon-log" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.173698 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.173717 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" containerName="horizon-log" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.174950 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.185665 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wtvdx" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.185881 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.186044 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.193073 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "f68f12a3-a61b-492b-94e9-4351419cfa7b" (UID: "f68f12a3-a61b-492b-94e9-4351419cfa7b"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.196296 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.211145 4676 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/f68f12a3-a61b-492b-94e9-4351419cfa7b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313094 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313136 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313188 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313216 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwwp\" (UniqueName: \"kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313270 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313330 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.313348 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.415620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417040 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417291 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417321 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwwp\" (UniqueName: \"kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417405 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417498 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.417520 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.419649 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.419879 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.422930 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.423887 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-78c887c44-wcq82"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.425214 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.426549 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.433235 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.440132 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-78c887c44-wcq82"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.443567 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwwp\" (UniqueName: \"kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.490378 4676 scope.go:117] "RemoveContainer" containerID="b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.508963 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.615059 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.616981 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.618692 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.621773 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.641916 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.656856 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.666623 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.680983 4676 scope.go:117] "RemoveContainer" containerID="061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940" Dec 04 15:40:36 crc kubenswrapper[4676]: E1204 15:40:36.681570 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940\": container with ID starting with 061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940 not found: ID does not exist" containerID="061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.681605 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940"} err="failed to get container status \"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940\": rpc error: code = NotFound desc = could not find container \"061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940\": container with ID starting with 061488abcb85ffa212fae6c89cfe9d5eb6536ad8a87e9419a263441bf411e940 not found: ID does not exist" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.681625 4676 scope.go:117] "RemoveContainer" containerID="b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa" Dec 04 15:40:36 crc kubenswrapper[4676]: E1204 15:40:36.681839 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa\": container with ID starting with b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa not found: ID does not exist" containerID="b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.681861 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa"} err="failed to get container status \"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa\": rpc error: code = NotFound desc = could not find container \"b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa\": container with ID starting with 
b3e32c7112e1092e0b700d4c638daeb5d7bcb843d9c5b81be98c90a1c35972aa not found: ID does not exist" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.735833 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.737004 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.737189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.737301 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.737485 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.737651 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54wd8\" (UniqueName: \"kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.738082 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.838962 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb\") pod \"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.838995 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw26m\" (UniqueName: \"kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m\") pod 
\"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839074 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config\") pod \"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839088 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb\") pod \"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839136 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0\") pod \"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839180 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc\") pod \"071cd019-bbb2-4632-a889-73e6f556d45e\" (UID: \"071cd019-bbb2-4632-a889-73e6f556d45e\") " Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839457 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839754 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839804 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839872 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839931 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.839968 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54wd8\" (UniqueName: \"kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.841218 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.850101 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.850201 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.854267 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.854514 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m" (OuterVolumeSpecName: "kube-api-access-lw26m") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "kube-api-access-lw26m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.869087 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.885655 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54wd8\" (UniqueName: \"kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.886222 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.894124 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config" (OuterVolumeSpecName: "config") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.895262 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.920182 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.926982 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.942575 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw26m\" (UniqueName: \"kubernetes.io/projected/071cd019-bbb2-4632-a889-73e6f556d45e-kube-api-access-lw26m\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.942607 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.942617 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.942626 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.942871 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.961771 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:36 crc kubenswrapper[4676]: I1204 15:40:36.962611 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "071cd019-bbb2-4632-a889-73e6f556d45e" (UID: "071cd019-bbb2-4632-a889-73e6f556d45e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.044048 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.044085 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/071cd019-bbb2-4632-a889-73e6f556d45e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.107055 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" event={"ID":"5e9cb383-58a8-45a6-86cf-85b52dd3311b","Type":"ContainerStarted","Data":"eb5e904a4fc3c5162eacec62c7aefa40a3dedc4ce4b29a9631080459a7f5ca35"} Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.117329 4676 generic.go:334] "Generic (PLEG): container finished" podID="071cd019-bbb2-4632-a889-73e6f556d45e" containerID="5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3" exitCode=0 Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.117494 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" event={"ID":"071cd019-bbb2-4632-a889-73e6f556d45e","Type":"ContainerDied","Data":"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3"} Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.117534 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" event={"ID":"071cd019-bbb2-4632-a889-73e6f556d45e","Type":"ContainerDied","Data":"e4a15faf217ff722100a3e6f9287198482a404e5ffb2f6c6edfb01db7d67ffec"} Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.117574 4676 scope.go:117] "RemoveContainer" containerID="5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.117773 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c5c8855-gnwsl" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.137071 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerStarted","Data":"5017307d15c2cd9ba68144ecc2685519cdeaa90c5dd7a5f2078ac59069785e65"} Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.137124 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerStarted","Data":"c5f600a18abd0588198fbe3de7b1123c4aa5da8776a404ec4155c4f7ce3c9cd7"} Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.138574 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56cc94d674-46bbd" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.212702 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56cc94d674-46bbd" podStartSLOduration=3.212675986 podStartE2EDuration="3.212675986s" podCreationTimestamp="2025-12-04 15:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:37.174927353 +0000 UTC m=+1244.609597220" watchObservedRunningTime="2025-12-04 15:40:37.212675986 +0000 UTC m=+1244.647345843" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.292790 4676 scope.go:117] "RemoveContainer" containerID="5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3" Dec 04 15:40:37 crc kubenswrapper[4676]: E1204 15:40:37.305856 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3\": container with ID starting with 5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3 not found: ID does not exist" containerID="5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.305927 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3"} err="failed to get container status \"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3\": rpc error: code = NotFound desc = could not find container \"5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3\": container with ID starting with 5f16b0099186a4139f81b7359a01dc8588d2f8ecb949d90c4128e26bcf7799e3 not found: ID does not exist" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.524022 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68f12a3-a61b-492b-94e9-4351419cfa7b" path="/var/lib/kubelet/pods/f68f12a3-a61b-492b-94e9-4351419cfa7b/volumes" Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.525095 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.642028 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.668706 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77c5c8855-gnwsl"] Dec 04 15:40:37 crc kubenswrapper[4676]: I1204 15:40:37.883852 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 04 15:40:38 crc kubenswrapper[4676]: I1204 15:40:38.210317 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerStarted","Data":"61cadc3ad9a73012d27bf8b099badf9231dc5f2e37584b8800af57fbd3f46379"} Dec 04 15:40:38 crc kubenswrapper[4676]: I1204 15:40:38.228023 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerStarted","Data":"186b529dbf1f0c2032a068c742acee85a727a5829b611f582e42f63ed958506f"} Dec 04 15:40:38 crc kubenswrapper[4676]: I1204 15:40:38.245546 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d","Type":"ContainerStarted","Data":"03838910110aac615e65e45735712158f3fabcf02cebf149a74a78220fb7c2ef"} Dec 04 15:40:38 crc kubenswrapper[4676]: I1204 15:40:38.258065 4676 generic.go:334] "Generic (PLEG): container finished" podID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerID="76e8ec3687c595b74a30ee8b2620faaaa2a2ddacd7461b0200f45a4341ebb4de" exitCode=0 Dec 04 15:40:38 crc kubenswrapper[4676]: I1204 15:40:38.259985 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" event={"ID":"5e9cb383-58a8-45a6-86cf-85b52dd3311b","Type":"ContainerDied","Data":"76e8ec3687c595b74a30ee8b2620faaaa2a2ddacd7461b0200f45a4341ebb4de"} Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.425489 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071cd019-bbb2-4632-a889-73e6f556d45e" path="/var/lib/kubelet/pods/071cd019-bbb2-4632-a889-73e6f556d45e/volumes" Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.464458 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" event={"ID":"5e9cb383-58a8-45a6-86cf-85b52dd3311b","Type":"ContainerStarted","Data":"c82492a192734375701e59a66be12946fadc6db4a6f6b952e3ed209ee42a79d2"} Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.464588 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.488925 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerStarted","Data":"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5"} Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.495002 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerStarted","Data":"21f7ad9ee346535f624fdcaa2ab2c6340d5e35c4bac7ec1499e6d404a31b6393"} Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.507149 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" podStartSLOduration=4.507127324 podStartE2EDuration="4.507127324s" podCreationTimestamp="2025-12-04 15:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:39.490489129 +0000 UTC m=+1246.925158986" watchObservedRunningTime="2025-12-04 15:40:39.507127324 +0000 UTC m=+1246.941797191" Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 
15:40:39.509782 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"68ff764e-4045-42f0-83c6-b0ab7a4f3d7d","Type":"ContainerStarted","Data":"ff7abf4c2f5c4c3f5ea62f21313517ff27f97cd53c962e82277520e68a563902"} Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.545652 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.545626299 podStartE2EDuration="5.545626299s" podCreationTimestamp="2025-12-04 15:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:39.538428328 +0000 UTC m=+1246.973098185" watchObservedRunningTime="2025-12-04 15:40:39.545626299 +0000 UTC m=+1246.980296156" Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.726738 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 04 15:40:39 crc kubenswrapper[4676]: I1204 15:40:39.750210 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 04 15:40:40 crc kubenswrapper[4676]: I1204 15:40:40.528258 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerStarted","Data":"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c"} Dec 04 15:40:40 crc kubenswrapper[4676]: I1204 15:40:40.535076 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerStarted","Data":"57994ab52e6e547774d3808d63d8d475e398cd9385aca9de950aa6a2008e9809"} Dec 04 15:40:40 crc kubenswrapper[4676]: I1204 15:40:40.567927 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.567856347 podStartE2EDuration="5.567856347s" podCreationTimestamp="2025-12-04 15:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:40.54946868 +0000 UTC m=+1247.984138537" watchObservedRunningTime="2025-12-04 15:40:40.567856347 +0000 UTC m=+1248.002526204" Dec 04 15:40:40 crc kubenswrapper[4676]: I1204 15:40:40.794829 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.794779073 podStartE2EDuration="5.794779073s" podCreationTimestamp="2025-12-04 15:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:40.779671482 +0000 UTC m=+1248.214341349" watchObservedRunningTime="2025-12-04 15:40:40.794779073 +0000 UTC m=+1248.229448920" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.062375 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.172678 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.406696 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68bd568fd5-srw6v"] Dec 04 15:40:41 crc kubenswrapper[4676]: E1204 15:40:41.407486 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="071cd019-bbb2-4632-a889-73e6f556d45e" containerName="init" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.407541 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="071cd019-bbb2-4632-a889-73e6f556d45e" containerName="init" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.407833 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="071cd019-bbb2-4632-a889-73e6f556d45e" containerName="init" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.409178 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.414411 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.414702 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.416132 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68bd568fd5-srw6v"] Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468714 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654gt\" (UniqueName: \"kubernetes.io/projected/5eab48dd-24f7-4439-bcc1-29f34b005bda-kube-api-access-654gt\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468774 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-internal-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468808 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-combined-ca-bundle\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468960 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-public-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.468990 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-httpd-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc 
kubenswrapper[4676]: I1204 15:40:41.469028 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-ovndb-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.560474 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-78ffb7b6cf-46b4r"] Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.562186 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.566428 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.566754 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.566985 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570598 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-public-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-httpd-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570665 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-ovndb-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570762 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-654gt\" (UniqueName: \"kubernetes.io/projected/5eab48dd-24f7-4439-bcc1-29f34b005bda-kube-api-access-654gt\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570782 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-internal-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.570803 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 
15:40:41.570859 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-combined-ca-bundle\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.587699 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-httpd-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.592622 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-public-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.602140 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-config\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.607442 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-combined-ca-bundle\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.626505 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-654gt\" (UniqueName: \"kubernetes.io/projected/5eab48dd-24f7-4439-bcc1-29f34b005bda-kube-api-access-654gt\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.627457 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-internal-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.646986 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78ffb7b6cf-46b4r"] Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.647747 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5eab48dd-24f7-4439-bcc1-29f34b005bda-ovndb-tls-certs\") pod \"neutron-68bd568fd5-srw6v\" (UID: \"5eab48dd-24f7-4439-bcc1-29f34b005bda\") " pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.675142 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-public-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc 
kubenswrapper[4676]: I1204 15:40:41.675234 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-internal-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.675306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-etc-swift\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.675974 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-run-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.676126 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2fjj\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-kube-api-access-w2fjj\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.676237 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-config-data\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.676474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-combined-ca-bundle\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.676572 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-log-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.737448 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777092 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-run-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777148 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2fjj\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-kube-api-access-w2fjj\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777177 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-config-data\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777227 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-combined-ca-bundle\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777260 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-log-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777296 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-public-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777328 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-internal-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.777371 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-etc-swift\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.778889 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-run-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc 
kubenswrapper[4676]: I1204 15:40:41.784374 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10ac9a17-d069-484c-9f44-baaada4618f8-log-httpd\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.789744 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-etc-swift\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.790444 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-public-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.792882 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-internal-tls-certs\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.799159 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-combined-ca-bundle\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.803956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10ac9a17-d069-484c-9f44-baaada4618f8-config-data\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:41 crc kubenswrapper[4676]: I1204 15:40:41.807545 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2fjj\" (UniqueName: \"kubernetes.io/projected/10ac9a17-d069-484c-9f44-baaada4618f8-kube-api-access-w2fjj\") pod \"swift-proxy-78ffb7b6cf-46b4r\" (UID: \"10ac9a17-d069-484c-9f44-baaada4618f8\") " pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.030013 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.572049 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-log" containerID="cri-o://04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" gracePeriod=30 Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.572130 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-log" containerID="cri-o://21f7ad9ee346535f624fdcaa2ab2c6340d5e35c4bac7ec1499e6d404a31b6393" gracePeriod=30 Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.572304 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-httpd" containerID="cri-o://57994ab52e6e547774d3808d63d8d475e398cd9385aca9de950aa6a2008e9809" gracePeriod=30 Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.572291 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-httpd" containerID="cri-o://23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" gracePeriod=30 Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.680770 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68bd568fd5-srw6v"] Dec 04 15:40:42 crc kubenswrapper[4676]: W1204 15:40:42.712980 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eab48dd_24f7_4439_bcc1_29f34b005bda.slice/crio-90dfa8e91fc01c1892c6f9def5dddb3225aa41287db969f2ccd1b7cb9183fb8b WatchSource:0}: Error finding container 90dfa8e91fc01c1892c6f9def5dddb3225aa41287db969f2ccd1b7cb9183fb8b: Status 404 returned error can't find the container with id 90dfa8e91fc01c1892c6f9def5dddb3225aa41287db969f2ccd1b7cb9183fb8b Dec 04 15:40:42 crc kubenswrapper[4676]: I1204 15:40:42.953772 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-78ffb7b6cf-46b4r"] Dec 04 15:40:42 crc kubenswrapper[4676]: W1204 15:40:42.994794 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10ac9a17_d069_484c_9f44_baaada4618f8.slice/crio-d76ae8671eb79acc94fd8938b2feb5a9d0a5ee5eabb8c936e92cde454afe4569 WatchSource:0}: Error finding container d76ae8671eb79acc94fd8938b2feb5a9d0a5ee5eabb8c936e92cde454afe4569: Status 404 returned error can't find the container with id d76ae8671eb79acc94fd8938b2feb5a9d0a5ee5eabb8c936e92cde454afe4569 Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.403752 4676 scope.go:117] "RemoveContainer" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.505651 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.573738 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.573804 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.573843 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.573890 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.574734 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.574850 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgwwp\" (UniqueName: \"kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.574890 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs\") pod \"a4680333-6827-4a80-ab35-c031c5cc4272\" (UID: \"a4680333-6827-4a80-ab35-c031c5cc4272\") " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.578966 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs" (OuterVolumeSpecName: "logs") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.585113 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.618235 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts" (OuterVolumeSpecName: "scripts") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.618393 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.652572 4676 generic.go:334] "Generic (PLEG): container finished" podID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerID="57994ab52e6e547774d3808d63d8d475e398cd9385aca9de950aa6a2008e9809" exitCode=0 Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.652630 4676 generic.go:334] "Generic (PLEG): container finished" podID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerID="21f7ad9ee346535f624fdcaa2ab2c6340d5e35c4bac7ec1499e6d404a31b6393" exitCode=143 Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.652705 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerDied","Data":"57994ab52e6e547774d3808d63d8d475e398cd9385aca9de950aa6a2008e9809"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.652740 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerDied","Data":"21f7ad9ee346535f624fdcaa2ab2c6340d5e35c4bac7ec1499e6d404a31b6393"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.662225 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp" (OuterVolumeSpecName: "kube-api-access-fgwwp") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "kube-api-access-fgwwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.718608 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.719204 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.790508 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.790553 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4680333-6827-4a80-ab35-c031c5cc4272-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.790566 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgwwp\" (UniqueName: \"kubernetes.io/projected/a4680333-6827-4a80-ab35-c031c5cc4272-kube-api-access-fgwwp\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.818554 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68bd568fd5-srw6v" event={"ID":"5eab48dd-24f7-4439-bcc1-29f34b005bda","Type":"ContainerStarted","Data":"65104f3acaace36dc8ff905dc6ad003335bff412ad948bc683145af869e2f580"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.818952 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68bd568fd5-srw6v" event={"ID":"5eab48dd-24f7-4439-bcc1-29f34b005bda","Type":"ContainerStarted","Data":"791cfb106669e1bf405c7b38a32643f9c3efeb970ef6b1594b50f95ad3abd4d6"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.818967 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68bd568fd5-srw6v" event={"ID":"5eab48dd-24f7-4439-bcc1-29f34b005bda","Type":"ContainerStarted","Data":"90dfa8e91fc01c1892c6f9def5dddb3225aa41287db969f2ccd1b7cb9183fb8b"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.819209 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68bd568fd5-srw6v" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.830070 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" event={"ID":"10ac9a17-d069-484c-9f44-baaada4618f8","Type":"ContainerStarted","Data":"926a8788f8a9ad1eee80828b02a5b9e28b558d306a1d823ef713791226c2011b"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.830124 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" event={"ID":"10ac9a17-d069-484c-9f44-baaada4618f8","Type":"ContainerStarted","Data":"d76ae8671eb79acc94fd8938b2feb5a9d0a5ee5eabb8c936e92cde454afe4569"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.832945 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4680333-6827-4a80-ab35-c031c5cc4272" containerID="23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" exitCode=0 Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.832988 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4680333-6827-4a80-ab35-c031c5cc4272" 
containerID="04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" exitCode=143 Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833029 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerDied","Data":"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833062 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerDied","Data":"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833085 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a4680333-6827-4a80-ab35-c031c5cc4272","Type":"ContainerDied","Data":"61cadc3ad9a73012d27bf8b099badf9231dc5f2e37584b8800af57fbd3f46379"} Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833117 4676 scope.go:117] "RemoveContainer" containerID="23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833384 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.833451 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.841586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data" (OuterVolumeSpecName: "config-data") pod "a4680333-6827-4a80-ab35-c031c5cc4272" (UID: "a4680333-6827-4a80-ab35-c031c5cc4272"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.864377 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.873575 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68bd568fd5-srw6v" podStartSLOduration=2.873542214 podStartE2EDuration="2.873542214s" podCreationTimestamp="2025-12-04 15:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:43.86482488 +0000 UTC m=+1251.299494737" watchObservedRunningTime="2025-12-04 15:40:43.873542214 +0000 UTC m=+1251.308212091" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.893689 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.893721 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4680333-6827-4a80-ab35-c031c5cc4272-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.893732 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 04 15:40:43 crc kubenswrapper[4676]: I1204 15:40:43.962159 4676 scope.go:117] "RemoveContainer" containerID="04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.018488 4676 scope.go:117] "RemoveContainer" containerID="23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.019082 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c\": container with ID starting with 23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c not found: ID does not exist" containerID="23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019116 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c"} err="failed to get container status \"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c\": rpc error: code = NotFound desc = could not find container \"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c\": container with ID starting with 23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c not found: ID does not exist" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019140 4676 scope.go:117] "RemoveContainer" containerID="04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.019563 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5\": container with ID starting with 
04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5 not found: ID does not exist" containerID="04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019599 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5"} err="failed to get container status \"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5\": rpc error: code = NotFound desc = could not find container \"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5\": container with ID starting with 04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5 not found: ID does not exist" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019624 4676 scope.go:117] "RemoveContainer" containerID="23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019840 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c"} err="failed to get container status \"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c\": rpc error: code = NotFound desc = could not find container \"23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c\": container with ID starting with 23b1b9c69f3b70eded6e2af722e4c5bae96b3346e38291fce277d3d03299692c not found: ID does not exist" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.019855 4676 scope.go:117] "RemoveContainer" containerID="04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.020157 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5"} err="failed to get container status \"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5\": rpc error: code = NotFound desc = could not find container \"04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5\": container with ID starting with 04776eeeb68fb24deb074fa47e96b0f658018514488e9cec5306348107db84d5 not found: ID does not exist" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.205510 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.215687 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.234077 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.270996 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.271576 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.271603 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.271643 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.271652 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.271665 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.271673 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: E1204 15:40:44.271692 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.271704 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.271998 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.272027 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-httpd" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.272047 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.272076 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" containerName="glance-log" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.273509 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.277485 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.277852 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.280597 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315639 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54wd8\" (UniqueName: \"kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315694 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315720 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315737 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315767 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315879 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.315939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs\") pod \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\" (UID: \"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b\") " Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.317192 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.317313 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs" (OuterVolumeSpecName: "logs") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.342511 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts" (OuterVolumeSpecName: "scripts") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.343134 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.343425 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8" (OuterVolumeSpecName: "kube-api-access-54wd8") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "kube-api-access-54wd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.392235 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.415520 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data" (OuterVolumeSpecName: "config-data") pod "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" (UID: "8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417701 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417844 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxxv\" (UniqueName: \"kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417863 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.417971 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418121 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418294 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418626 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418652 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54wd8\" (UniqueName: \"kubernetes.io/projected/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-kube-api-access-54wd8\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418666 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418678 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418705 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" "
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418755 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.418769 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.442839 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646284 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646367 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxxv\" (UniqueName: \"kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646388 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646428 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646524 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.646541 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.648480 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.648550 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.648861 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.649273 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.649735 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.649857 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.670185 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.670612 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxxv\" (UniqueName: \"kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.670775 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.672008 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.676648 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.732289 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.873588 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b","Type":"ContainerDied","Data":"186b529dbf1f0c2032a068c742acee85a727a5829b611f582e42f63ed958506f"}
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.873646 4676 scope.go:117] "RemoveContainer" containerID="57994ab52e6e547774d3808d63d8d475e398cd9385aca9de950aa6a2008e9809"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.873840 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.886091 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"}
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.892567 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.893875 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" event={"ID":"10ac9a17-d069-484c-9f44-baaada4618f8","Type":"ContainerStarted","Data":"79d72835875133ce0e0d646fdeb454d41ca1cf7ce67684f074268b0e770d2e04"}
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.894670 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78ffb7b6cf-46b4r"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.894707 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-78ffb7b6cf-46b4r"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.966814 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" podStartSLOduration=3.966799027 podStartE2EDuration="3.966799027s" podCreationTimestamp="2025-12-04 15:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:44.93197004 +0000 UTC m=+1252.366639917" watchObservedRunningTime="2025-12-04 15:40:44.966799027 +0000 UTC m=+1252.401468884"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.975546 4676 scope.go:117] "RemoveContainer" containerID="21f7ad9ee346535f624fdcaa2ab2c6340d5e35c4bac7ec1499e6d404a31b6393"
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.985250 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:40:44 crc kubenswrapper[4676]: I1204 15:40:44.996044 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.047230 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.053036 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.056057 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.070539 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.079659 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.119093 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.255096 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.255467 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.256735 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.258301 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.258394 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.258505 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.258715 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.258825 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmjd\" (UniqueName: \"kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363256 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363554 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363608 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363685 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363721 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363779 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmjd\" (UniqueName: \"kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363874 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.363930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.365360 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.365895 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.366050 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.374150 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.384100 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.386251 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.392880 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.394540 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmjd\" (UniqueName: \"kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.404546 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.448829 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b" path="/var/lib/kubelet/pods/8334ac7f-0cf6-494f-ad3e-e8b3d724ea4b/volumes"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.454212 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4680333-6827-4a80-ab35-c031c5cc4272" path="/var/lib/kubelet/pods/a4680333-6827-4a80-ab35-c031c5cc4272/volumes"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.610095 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.877308 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr"
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.976224 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"]
Dec 04 15:40:45 crc kubenswrapper[4676]: I1204 15:40:45.976516 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="dnsmasq-dns" containerID="cri-o://e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f" gracePeriod=10
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.029458 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.029519 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.068069 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.434852 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.883814 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.954925 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerStarted","Data":"9eac7c45e585bc06a7a3dfa6bfc8df5b9410801192922c20099e820ac4057ac4"}
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.957770 4676 generic.go:334] "Generic (PLEG): container finished" podID="5d8c669b-28cb-4230-9425-671d7d330d89" containerID="e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f" exitCode=0
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.957832 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerDied","Data":"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"}
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.957852 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws" event={"ID":"5d8c669b-28cb-4230-9425-671d7d330d89","Type":"ContainerDied","Data":"1490b9298684c99c84f5a1f31efbb85bbbc4850f1fef4aa3a5954e0a802baa2a"}
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.957869 4676 scope.go:117] "RemoveContainer" containerID="e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.957992 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b9d66887-9f4ws"
Dec 04 15:40:46 crc kubenswrapper[4676]: I1204 15:40:46.963198 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerStarted","Data":"6aef207d04c69f32747efd74d6b3d10a63dbfb463f112e6a615cc970af7e59a3"}
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:46.999890 4676 scope.go:117] "RemoveContainer" containerID="a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.057622 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8mz5\" (UniqueName: \"kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.057749 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.057795 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.057833 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.057862 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.058055 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config\") pod \"5d8c669b-28cb-4230-9425-671d7d330d89\" (UID: \"5d8c669b-28cb-4230-9425-671d7d330d89\") "
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.069982 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5" (OuterVolumeSpecName: "kube-api-access-f8mz5") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "kube-api-access-f8mz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.076092 4676 scope.go:117] "RemoveContainer" containerID="e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"
Dec 04 15:40:47 crc kubenswrapper[4676]: E1204 15:40:47.080011 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f\": container with ID starting with e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f not found: ID does not exist" containerID="e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.080050 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f"} err="failed to get container status \"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f\": rpc error: code = NotFound desc = could not find container \"e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f\": container with ID starting with e31589db5549597e443e1565d1fa5f51658674507343330d523d8cced785e76f not found: ID does not exist"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.080094 4676 scope.go:117] "RemoveContainer" containerID="a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"
Dec 04 15:40:47 crc kubenswrapper[4676]: E1204 15:40:47.081209 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4\": container with ID starting with a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4 not found: ID does not exist" containerID="a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.081231 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4"} err="failed to get container status \"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4\": rpc error: code = NotFound desc = could not find container \"a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4\": container with ID starting with a534b734329402a7e6c56e47f48bdf83038018b49ca3e1de684c032e521defc4 not found: ID does not exist"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.126305 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config" (OuterVolumeSpecName: "config") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.157707 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.159100 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.160883 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.161172 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.161195 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.161211 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.161225 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-config\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.161235 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8mz5\" (UniqueName: \"kubernetes.io/projected/5d8c669b-28cb-4230-9425-671d7d330d89-kube-api-access-f8mz5\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.170789 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d8c669b-28cb-4230-9425-671d7d330d89" (UID: "5d8c669b-28cb-4230-9425-671d7d330d89"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.262254 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d8c669b-28cb-4230-9425-671d7d330d89-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.319974 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"]
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.327512 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b9d66887-9f4ws"]
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.408129 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" path="/var/lib/kubelet/pods/5d8c669b-28cb-4230-9425-671d7d330d89/volumes"
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.975149 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerStarted","Data":"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"}
Dec 04 15:40:47 crc kubenswrapper[4676]: I1204 15:40:47.979499 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerStarted","Data":"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2"}
Dec 04 15:40:48 crc kubenswrapper[4676]: I1204 15:40:48.942992 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:40:48 crc kubenswrapper[4676]: I1204 15:40:48.993466 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerStarted","Data":"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"}
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.000340 4676 generic.go:334] "Generic (PLEG): container finished" podID="6cfbf976-db77-44d0-9a80-83648d806eea" containerID="0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f" exitCode=137
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.000403 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerDied","Data":"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"}
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.000433 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6cfbf976-db77-44d0-9a80-83648d806eea","Type":"ContainerDied","Data":"32e5f16028fc90f5de5f4035e8874d01204e1ccd47f0c4d902d36022562e20e7"}
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.000452 4676 scope.go:117] "RemoveContainer" containerID="0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.000607 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.004982 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerStarted","Data":"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16"}
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.029428 4676 scope.go:117] "RemoveContainer" containerID="57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.054136 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.054113626 podStartE2EDuration="5.054113626s" podCreationTimestamp="2025-12-04 15:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:49.050186682 +0000 UTC m=+1256.484856559" watchObservedRunningTime="2025-12-04 15:40:49.054113626 +0000 UTC m=+1256.488783493"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.062786 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.062755379 podStartE2EDuration="5.062755379s" podCreationTimestamp="2025-12-04 15:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:40:49.021275487 +0000 UTC m=+1256.455945335" watchObservedRunningTime="2025-12-04 15:40:49.062755379 +0000 UTC m=+1256.497425256"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.063686 4676 scope.go:117] "RemoveContainer" containerID="79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.103402 4676 scope.go:117] "RemoveContainer" containerID="0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107334 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107445 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rxsr\" (UniqueName: \"kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107672 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107714 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.107743 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle\") pod \"6cfbf976-db77-44d0-9a80-83648d806eea\" (UID: \"6cfbf976-db77-44d0-9a80-83648d806eea\") "
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.109199 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.109713 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.111424 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f\": container with ID starting with 0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f not found: ID does not exist" containerID="0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.111475 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f"} err="failed to get container status \"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f\": rpc error: code = NotFound desc = could not find container \"0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f\": container with ID starting with 0528a7252b930a53c406b81a1df3f6884987b4b1fe56f62992048954ca283e2f not found: ID does not exist"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.111499 4676 scope.go:117] "RemoveContainer" containerID="57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.117132 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr" (OuterVolumeSpecName: "kube-api-access-9rxsr") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "kube-api-access-9rxsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.117241 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts" (OuterVolumeSpecName: "scripts") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.117314 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f\": container with ID starting with 57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f not found: ID does not exist" containerID="57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.117359 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f"} err="failed to get container status \"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f\": rpc error: code = NotFound desc = could not find container \"57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f\": container with ID starting with 57dbe413f34d67b3f88bc52d9e784945849282846d54081600ec07a9cd787f0f not found: ID does not exist"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.117386 4676 scope.go:117] "RemoveContainer" containerID="79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.117676 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860\": container with ID starting with 79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860 not found: ID does not exist" containerID="79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.117699 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860"} err="failed to get container status \"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860\": rpc error: code = NotFound desc = could not find container \"79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860\": container with ID starting with 79fb8322b359d466e6c6c027ed0b8fa9abf27e6d199efaed02493b6afb2b8860 not found: ID does not exist"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.144604 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.166423 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.209982 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.210011 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.210020 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.210028 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6cfbf976-db77-44d0-9a80-83648d806eea-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.210037 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rxsr\" (UniqueName: \"kubernetes.io/projected/6cfbf976-db77-44d0-9a80-83648d806eea-kube-api-access-9rxsr\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.210047 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.214277 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data" (OuterVolumeSpecName: "config-data") pod "6cfbf976-db77-44d0-9a80-83648d806eea" (UID: "6cfbf976-db77-44d0-9a80-83648d806eea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.311789 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cfbf976-db77-44d0-9a80-83648d806eea-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.407288 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.532976 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.554483 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.556766 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="sg-core"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.556789 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="sg-core"
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.556806 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="dnsmasq-dns"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.556983 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="dnsmasq-dns"
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.557002 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="init"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557051 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="init"
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.557114 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="ceilometer-notification-agent"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557123 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="ceilometer-notification-agent"
Dec 04 15:40:49 crc kubenswrapper[4676]: E1204 15:40:49.557193 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="proxy-httpd"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557201 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="proxy-httpd"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557861 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="proxy-httpd"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557932 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="sg-core"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557944 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8c669b-28cb-4230-9425-671d7d330d89" containerName="dnsmasq-dns"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.557954 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" containerName="ceilometer-notification-agent"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.561857 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.567020 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.567206 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.579444 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.712800 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713000 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drt8v\" (UniqueName: \"kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713087 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713152 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713233 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713332 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.713442 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814681 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drt8v\" (UniqueName: \"kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814759 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814799 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814824 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814883 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.814965 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.815002 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.815746 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.815927 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.819135 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.819272 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.820292 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.829464 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.833956 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drt8v\" (UniqueName: \"kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v\") pod \"ceilometer-0\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") " pod="openstack/ceilometer-0"
Dec 04 15:40:49 crc kubenswrapper[4676]: I1204 15:40:49.887667 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.033509 4676 generic.go:334] "Generic (PLEG): container finished" podID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08" exitCode=1
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.033571 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"}
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.034153 4676 scope.go:117] "RemoveContainer" containerID="3f2b62329be6489cc63257f8a5b22d331c844d3d597c198df35a9817ac93f710"
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.035158 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:40:50 crc kubenswrapper[4676]: E1204 15:40:50.035562 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.583704 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:50 crc kubenswrapper[4676]: I1204 15:40:50.902480 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:40:51 crc kubenswrapper[4676]: I1204 15:40:51.050940 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerStarted","Data":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"}
Dec 04 15:40:51 crc kubenswrapper[4676]: I1204 15:40:51.051264 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerStarted","Data":"2853fd213160c33e651bfdf8b69050a169117e7305ccfa5107746819b931f0cc"}
Dec 04 15:40:51 crc kubenswrapper[4676]: I1204 15:40:51.399075 4676
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cfbf976-db77-44d0-9a80-83648d806eea" path="/var/lib/kubelet/pods/6cfbf976-db77-44d0-9a80-83648d806eea/volumes" Dec 04 15:40:52 crc kubenswrapper[4676]: I1204 15:40:52.046520 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:52 crc kubenswrapper[4676]: I1204 15:40:52.048196 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" Dec 04 15:40:52 crc kubenswrapper[4676]: I1204 15:40:52.900674 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:40:52 crc kubenswrapper[4676]: I1204 15:40:52.900735 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:40:52 crc kubenswrapper[4676]: I1204 15:40:52.901426 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08" Dec 04 15:40:52 crc kubenswrapper[4676]: E1204 15:40:52.901687 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" Dec 04 15:40:54 crc kubenswrapper[4676]: I1204 15:40:54.893357 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 15:40:54 crc kubenswrapper[4676]: I1204 15:40:54.893808 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 04 15:40:54 crc kubenswrapper[4676]: I1204 15:40:54.940572 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 15:40:54 crc kubenswrapper[4676]: I1204 15:40:54.953939 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.114886 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.115412 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.612213 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.612528 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.643488 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:55 crc kubenswrapper[4676]: I1204 15:40:55.685384 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:56 crc kubenswrapper[4676]: I1204 15:40:56.125821 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:56 
crc kubenswrapper[4676]: I1204 15:40:56.125872 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 04 15:40:56 crc kubenswrapper[4676]: I1204 15:40:56.461487 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 04 15:40:57 crc kubenswrapper[4676]: I1204 15:40:57.139062 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:57 crc kubenswrapper[4676]: I1204 15:40:57.139091 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:57 crc kubenswrapper[4676]: I1204 15:40:57.549708 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 15:40:57 crc kubenswrapper[4676]: I1204 15:40:57.585122 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 15:40:57 crc kubenswrapper[4676]: I1204 15:40:57.992140 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.151102 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.151912 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.151641 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-log" containerID="cri-o://cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5" gracePeriod=30 Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.151674 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-httpd" containerID="cri-o://c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d" gracePeriod=30 Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.152087 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" containerID="cri-o://6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16" gracePeriod=30 Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.152150 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" containerID="cri-o://f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2" gracePeriod=30 Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.163430 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.185:9292/healthcheck\": EOF" Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.163452 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.185:9292/healthcheck\": EOF" Dec 04 15:40:58 crc kubenswrapper[4676]: 
I1204 15:40:58.170191 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.186:9292/healthcheck\": EOF" Dec 04 15:40:58 crc kubenswrapper[4676]: I1204 15:40:58.170746 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.186:9292/healthcheck\": EOF" Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.165994 4676 generic.go:334] "Generic (PLEG): container finished" podID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerID="f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2" exitCode=143 Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.166290 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerDied","Data":"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2"} Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.172876 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da921c96-bdd0-4aa2-a98e-9adc22788b75","Type":"ContainerStarted","Data":"9e661cbe7a74c929aca8ee8e3bc940f18534e92b463018cad6aa4efb57a313cd"} Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.182327 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerStarted","Data":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"} Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.182384 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerStarted","Data":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"} Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.186190 4676 generic.go:334] "Generic (PLEG): container finished" podID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerID="cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5" exitCode=143 Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.186234 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerDied","Data":"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"} Dec 04 15:40:59 crc kubenswrapper[4676]: I1204 15:40:59.199487 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.211306715 podStartE2EDuration="26.199462601s" podCreationTimestamp="2025-12-04 15:40:33 +0000 UTC" firstStartedPulling="2025-12-04 15:40:34.211962965 +0000 UTC m=+1241.646632822" lastFinishedPulling="2025-12-04 15:40:58.200118851 +0000 UTC m=+1265.634788708" observedRunningTime="2025-12-04 15:40:59.198620776 +0000 UTC m=+1266.633290633" watchObservedRunningTime="2025-12-04 15:40:59.199462601 +0000 UTC m=+1266.634132458" Dec 04 15:41:00 crc kubenswrapper[4676]: I1204 15:41:00.413856 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" probeResult="failure" output="Get 
\"https://10.217.0.186:9292/healthcheck\": read tcp 10.217.0.2:42830->10.217.0.186:9292: read: connection reset by peer" Dec 04 15:41:00 crc kubenswrapper[4676]: I1204 15:41:00.413887 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.186:9292/healthcheck\": read tcp 10.217.0.2:42834->10.217.0.186:9292: read: connection reset by peer" Dec 04 15:41:00 crc kubenswrapper[4676]: I1204 15:41:00.990741 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.155532 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.155694 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxmjd\" (UniqueName: \"kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.155743 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.155827 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.155956 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156020 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156080 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156143 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run\") pod \"e381383e-d565-4243-92d1-d9ea82e7cad8\" (UID: \"e381383e-d565-4243-92d1-d9ea82e7cad8\") " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156382 4676 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs" (OuterVolumeSpecName: "logs") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156623 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156956 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.156973 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e381383e-d565-4243-92d1-d9ea82e7cad8-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.165280 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd" (OuterVolumeSpecName: "kube-api-access-gxmjd") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "kube-api-access-gxmjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.165424 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.174125 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts" (OuterVolumeSpecName: "scripts") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.193123 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214222 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerStarted","Data":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"} Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214433 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-central-agent" containerID="cri-o://96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d" gracePeriod=30 Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214743 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214778 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="proxy-httpd" containerID="cri-o://41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a" gracePeriod=30 Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214805 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="sg-core" containerID="cri-o://65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f" gracePeriod=30 Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.214861 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-notification-agent" containerID="cri-o://3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254" gracePeriod=30 Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.223463 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data" (OuterVolumeSpecName: "config-data") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.228639 4676 generic.go:334] "Generic (PLEG): container finished" podID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerID="6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16" exitCode=0 Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.228697 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerDied","Data":"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16"} Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.228763 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e381383e-d565-4243-92d1-d9ea82e7cad8","Type":"ContainerDied","Data":"6aef207d04c69f32747efd74d6b3d10a63dbfb463f112e6a615cc970af7e59a3"} Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.228790 4676 scope.go:117] "RemoveContainer" containerID="6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.229005 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.239857 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.420559577 podStartE2EDuration="12.2398322s" podCreationTimestamp="2025-12-04 15:40:49 +0000 UTC" firstStartedPulling="2025-12-04 15:40:50.609148034 +0000 UTC m=+1258.043817891" lastFinishedPulling="2025-12-04 15:41:00.428420657 +0000 UTC m=+1267.863090514" observedRunningTime="2025-12-04 15:41:01.23914639 +0000 UTC m=+1268.673816247" watchObservedRunningTime="2025-12-04 15:41:01.2398322 +0000 UTC m=+1268.674502057" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.262524 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.262568 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.262581 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.262617 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.262632 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxmjd\" (UniqueName: \"kubernetes.io/projected/e381383e-d565-4243-92d1-d9ea82e7cad8-kube-api-access-gxmjd\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.280197 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e381383e-d565-4243-92d1-d9ea82e7cad8" (UID: "e381383e-d565-4243-92d1-d9ea82e7cad8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.284376 4676 scope.go:117] "RemoveContainer" containerID="f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.305987 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.316391 4676 scope.go:117] "RemoveContainer" containerID="6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16" Dec 04 15:41:01 crc kubenswrapper[4676]: E1204 15:41:01.317439 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16\": container with ID starting with 6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16 not found: ID does not exist" containerID="6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.317548 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16"} err="failed to get container status \"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16\": rpc error: code = NotFound desc = could not find container \"6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16\": container with ID starting with 6902103553f63b785d4f8b4bc1554daf3f506465c24381191b0c5a8b7b1f9d16 not found: ID does not exist" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.317649 4676 scope.go:117] "RemoveContainer" containerID="f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2" Dec 04 15:41:01 crc kubenswrapper[4676]: E1204 15:41:01.323620 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2\": container with ID starting with f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2 not found: ID does not exist" containerID="f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.323798 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2"} err="failed to get container status \"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2\": rpc error: code = NotFound desc = could not find container \"f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2\": container with ID starting with f3d291c04001bbeaba6ea1899b39fc4d156f995af479f87b88741ed032476ab2 not found: ID does not exist" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.364341 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.364382 4676 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e381383e-d565-4243-92d1-d9ea82e7cad8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.571005 4676 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.581525 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.599668 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:41:01 crc kubenswrapper[4676]: E1204 15:41:01.600131 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.600155 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" Dec 04 15:41:01 crc kubenswrapper[4676]: E1204 15:41:01.600177 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.600184 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.600408 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-httpd" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.600435 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" containerName="glance-log" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.601670 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.607451 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.608087 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.615618 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786553 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786648 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786682 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786703 4676 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786733 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-logs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786771 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786813 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g6kb\" (UniqueName: \"kubernetes.io/projected/98ccf1a8-b6c5-4f19-af89-531b204e79eb-kube-api-access-9g6kb\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.786850 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889125 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889264 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889334 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889373 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889437 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-logs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889524 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.889604 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g6kb\" (UniqueName: \"kubernetes.io/projected/98ccf1a8-b6c5-4f19-af89-531b204e79eb-kube-api-access-9g6kb\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.890599 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.893140 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98ccf1a8-b6c5-4f19-af89-531b204e79eb-logs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.893159 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.899603 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.910822 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.914225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.918163 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/98ccf1a8-b6c5-4f19-af89-531b204e79eb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.918776 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g6kb\" (UniqueName: \"kubernetes.io/projected/98ccf1a8-b6c5-4f19-af89-531b204e79eb-kube-api-access-9g6kb\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.945399 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"98ccf1a8-b6c5-4f19-af89-531b204e79eb\") " pod="openstack/glance-default-internal-api-0" Dec 04 15:41:01 crc kubenswrapper[4676]: I1204 15:41:01.955095 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.030600 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.180039 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.205970 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206024 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206148 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206176 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206227 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpxxv\" (UniqueName: \"kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206279 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206406 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.206453 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts\") pod \"f915ebe5-d216-4de0-ad9e-506664c6e27f\" (UID: \"f915ebe5-d216-4de0-ad9e-506664c6e27f\") " Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.208063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.208541 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs" (OuterVolumeSpecName: "logs") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.212549 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.220974 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts" (OuterVolumeSpecName: "scripts") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.221193 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv" (OuterVolumeSpecName: "kube-api-access-fpxxv") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "kube-api-access-fpxxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.242471 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a" exitCode=0 Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.242765 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f" exitCode=2 Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243282 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254" exitCode=0 Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243406 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d" exitCode=0 Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.242567 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.242534 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerDied","Data":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243738 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerDied","Data":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243765 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerDied","Data":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243778 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerDied","Data":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243791 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b6ad50b-8581-4079-9da8-1115ec1316f2","Type":"ContainerDied","Data":"2853fd213160c33e651bfdf8b69050a169117e7305ccfa5107746819b931f0cc"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.243811 4676 scope.go:117] "RemoveContainer" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.247525 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.247809 4676 generic.go:334] "Generic (PLEG): container finished" podID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerID="c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d" exitCode=0
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.247971 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerDied","Data":"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.248228 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f915ebe5-d216-4de0-ad9e-506664c6e27f","Type":"ContainerDied","Data":"9eac7c45e585bc06a7a3dfa6bfc8df5b9410801192922c20099e820ac4057ac4"}
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.248196 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.278529 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.281662 4676 scope.go:117] "RemoveContainer" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.282424 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data" (OuterVolumeSpecName: "config-data") pod "f915ebe5-d216-4de0-ad9e-506664c6e27f" (UID: "f915ebe5-d216-4de0-ad9e-506664c6e27f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.305151 4676 scope.go:117] "RemoveContainer" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.308724 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.308866 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.308949 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drt8v\" (UniqueName: \"kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.308979 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309010 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309066 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309227 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml\") pod \"2b6ad50b-8581-4079-9da8-1115ec1316f2\" (UID: \"2b6ad50b-8581-4079-9da8-1115ec1316f2\") "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309727 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309742 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309754 4676 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309775 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309786 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpxxv\" (UniqueName: \"kubernetes.io/projected/f915ebe5-d216-4de0-ad9e-506664c6e27f-kube-api-access-fpxxv\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309796 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309805 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f915ebe5-d216-4de0-ad9e-506664c6e27f-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.309814 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f915ebe5-d216-4de0-ad9e-506664c6e27f-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.310236 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.310841 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.315386 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v" (OuterVolumeSpecName: "kube-api-access-drt8v") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "kube-api-access-drt8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.317267 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts" (OuterVolumeSpecName: "scripts") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.340700 4676 scope.go:117] "RemoveContainer" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.364787 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.365288 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.369292 4676 scope.go:117] "RemoveContainer" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.370469 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": container with ID starting with 41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a not found: ID does not exist" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.370694 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"} err="failed to get container status \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": rpc error: code = NotFound desc = could not find container \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": container with ID starting with 41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.370767 4676 scope.go:117] "RemoveContainer" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.371312 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": container with ID starting with 65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f not found: ID does not exist" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.371371 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"} err="failed to get container status \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": rpc error: code = NotFound desc = could not find container \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": container with ID starting with 65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.371404 4676 scope.go:117] "RemoveContainer" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.375206 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": container with ID starting with 3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254 not found: ID does not exist" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.375346 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"} err="failed to get container status \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": rpc error: code = NotFound desc = could not find container \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": container with ID starting with 3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254 not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.375377 4676 scope.go:117] "RemoveContainer" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.375883 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": container with ID starting with 96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d not found: ID does not exist" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.375980 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"} err="failed to get container status \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": rpc error: code = NotFound desc = could not find container \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": container with ID starting with 96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.376026 4676 scope.go:117] "RemoveContainer" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.377780 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"} err="failed to get container status \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": rpc error: code = NotFound desc = could not find container \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": container with ID starting with 41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.377837 4676 scope.go:117] "RemoveContainer" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.379575 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"} err="failed to get container status \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": rpc error: code = NotFound desc = could not find container \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": container with ID starting with 65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.379650 4676 scope.go:117] "RemoveContainer" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380078 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"} err="failed to get container status \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": rpc error: code = NotFound desc = could not find container \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": container with ID starting with 3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254 not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380116 4676 scope.go:117] "RemoveContainer" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380380 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"} err="failed to get container status \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": rpc error: code = NotFound desc = could not find container \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": container with ID starting with 96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380399 4676 scope.go:117] "RemoveContainer" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380630 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"} err="failed to get container status \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": rpc error: code = NotFound desc = could not find container \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": container with ID starting with 41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380674 4676 scope.go:117] "RemoveContainer" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380965 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"} err="failed to get container status \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": rpc error: code = NotFound desc = could not find container \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": container with ID starting with 65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.380999 4676 scope.go:117] "RemoveContainer" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381229 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"} err="failed to get container status \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": rpc error: code = NotFound desc = could not find container \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": container with ID starting with 3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254 not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381251 4676 scope.go:117] "RemoveContainer" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381511 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"} err="failed to get container status \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": rpc error: code = NotFound desc = could not find container \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": container with ID starting with 96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381551 4676 scope.go:117] "RemoveContainer" containerID="41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381823 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a"} err="failed to get container status \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": rpc error: code = NotFound desc = could not find container \"41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a\": container with ID starting with 41c9e75930eda9998d33fc8913a440759a37021bb468ff72fb64b12bc59f204a not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.381847 4676 scope.go:117] "RemoveContainer" containerID="65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.382195 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f"} err="failed to get container status \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": rpc error: code = NotFound desc = could not find container \"65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f\": container with ID starting with 65080ad5bcda50aeaed1f8bcb6e8b54e56094f270d4f37342f4e65b853fe4f9f not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.382220 4676 scope.go:117] "RemoveContainer" containerID="3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.382490 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254"} err="failed to get container status \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": rpc error: code = NotFound desc = could not find container \"3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254\": container with ID starting with 3de3f163a2ff38dfbc8e702853edd53fae834aa3c9bf9a7234f68c2a92996254 not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.382512 4676 scope.go:117] "RemoveContainer" containerID="96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.383266 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d"} err="failed to get container status \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": rpc error: code = NotFound desc = could not find container \"96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d\": container with ID starting with 96c6cecc70d6567069e26c20c23771bf12f1e29d401922de93ad2ca78b6ce70d not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.383298 4676 scope.go:117] "RemoveContainer" containerID="c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411496 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411531 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411548 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411561 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drt8v\" (UniqueName: \"kubernetes.io/projected/2b6ad50b-8581-4079-9da8-1115ec1316f2-kube-api-access-drt8v\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411574 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.411588 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b6ad50b-8581-4079-9da8-1115ec1316f2-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.420086 4676 scope.go:117] "RemoveContainer" containerID="cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.441958 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.460219 4676 scope.go:117] "RemoveContainer" containerID="c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.461711 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d\": container with ID starting with c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d not found: ID does not exist" containerID="c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.461753 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d"} err="failed to get container status \"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d\": rpc error: code = NotFound desc = could not find container \"c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d\": container with ID starting with c7c4300ad6e03ea788b37c6c4cf4869010a010abc7ea37fb378f444c3b27031d not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.461796 4676 scope.go:117] "RemoveContainer" containerID="cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.469805 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5\": container with ID starting with cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5 not found: ID does not exist" containerID="cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.469842 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5"} err="failed to get container status \"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5\": rpc error: code = NotFound desc = could not find container \"cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5\": container with ID starting with cfe0464163fbf3545a36b504582010e9352792a9710a3bd8cd54c580701b5bc5 not found: ID does not exist"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.506704 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data" (OuterVolumeSpecName: "config-data") pod "2b6ad50b-8581-4079-9da8-1115ec1316f2" (UID: "2b6ad50b-8581-4079-9da8-1115ec1316f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.513384 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.513420 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b6ad50b-8581-4079-9da8-1115ec1316f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:02 crc kubenswrapper[4676]: W1204 15:41:02.563169 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ccf1a8_b6c5_4f19_af89_531b204e79eb.slice/crio-e7ef067e90a6cc1e222dda9a3552901a928c909ef4af954da84a1cea86b745c3 WatchSource:0}: Error finding container e7ef067e90a6cc1e222dda9a3552901a928c909ef4af954da84a1cea86b745c3: Status 404 returned error can't find the container with id e7ef067e90a6cc1e222dda9a3552901a928c909ef4af954da84a1cea86b745c3
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.569675 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.699992 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.721528 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.745223 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.757858 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.769948 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770438 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770452 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770462 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-notification-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770469 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-notification-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770482 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-central-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770488 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-central-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770506 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-log"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770513 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-log"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770533 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="proxy-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770538 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="proxy-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.770548 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="sg-core"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770554 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="sg-core"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770741 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-notification-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770757 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-log"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770779 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="sg-core"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770790 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="proxy-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770800 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" containerName="ceilometer-central-agent"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.770809 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" containerName="glance-httpd"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.771938 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.777214 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.777306 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.798730 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.798772 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.802241 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.805255 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.809275 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.832543 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.901246 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.901296 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.902234 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:41:02 crc kubenswrapper[4676]: E1204 15:41:02.902516 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.920650 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.920705 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.920737 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6zj\" (UniqueName: \"kubernetes.io/projected/5a9c189a-a32b-46fc-99ef-c643d9959aa5-kube-api-access-ms6zj\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.920988 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-logs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921158 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921223 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7kmv\" (UniqueName: \"kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921363 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921419 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921587 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921648 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921756 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.921993 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:02 crc kubenswrapper[4676]: I1204 15:41:02.922071 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024202 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6zj\" (UniqueName: \"kubernetes.io/projected/5a9c189a-a32b-46fc-99ef-c643d9959aa5-kube-api-access-ms6zj\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024222 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-logs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024266 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024288 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7kmv\" (UniqueName: \"kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024312 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024327 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024344 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024365 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024386 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024451 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024487 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024512 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.024866 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.025144 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.025479 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-logs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.025792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5a9c189a-a32b-46fc-99ef-c643d9959aa5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.025819 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.034787 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-scripts\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.035440 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.035526 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.035569 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.036197 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.036265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.036568 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.045878 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6zj\" (UniqueName: \"kubernetes.io/projected/5a9c189a-a32b-46fc-99ef-c643d9959aa5-kube-api-access-ms6zj\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.054710 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7kmv\" (UniqueName: \"kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv\") pod \"ceilometer-0\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.055306 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a9c189a-a32b-46fc-99ef-c643d9959aa5-config-data\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.075764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"5a9c189a-a32b-46fc-99ef-c643d9959aa5\") " pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.103591 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.128487 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.273864 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98ccf1a8-b6c5-4f19-af89-531b204e79eb","Type":"ContainerStarted","Data":"e7ef067e90a6cc1e222dda9a3552901a928c909ef4af954da84a1cea86b745c3"}
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.397667 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b6ad50b-8581-4079-9da8-1115ec1316f2" path="/var/lib/kubelet/pods/2b6ad50b-8581-4079-9da8-1115ec1316f2/volumes"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.398828 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e381383e-d565-4243-92d1-d9ea82e7cad8" path="/var/lib/kubelet/pods/e381383e-d565-4243-92d1-d9ea82e7cad8/volumes"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.401069 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f915ebe5-d216-4de0-ad9e-506664c6e27f" path="/var/lib/kubelet/pods/f915ebe5-d216-4de0-ad9e-506664c6e27f/volumes"
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.676734 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.706221 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 04 15:41:03 crc kubenswrapper[4676]: I1204 15:41:03.833571 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.303464 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a9c189a-a32b-46fc-99ef-c643d9959aa5","Type":"ContainerStarted","Data":"d365a76432c3436f8f918f7208cc69d6202fd250894c9537150373d7cebb7ad8"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.305523 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerStarted","Data":"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.305556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerStarted","Data":"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.305566 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerStarted","Data":"5945325aff7d239d419ac4d972a21e72f4e51d059f112293f3afb5254e1e4b99"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.308329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98ccf1a8-b6c5-4f19-af89-531b204e79eb","Type":"ContainerStarted","Data":"4538cc75ae6cfd3da94fd207d24af7704d51304cf0e1e8586aef5b7f039b9a4e"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.308380 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"98ccf1a8-b6c5-4f19-af89-531b204e79eb","Type":"ContainerStarted","Data":"48c0f43676bd981ae807fd7acf83d4924e6e16a547b0b49de6bceee6751087d0"}
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.338478 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.3384603999999998 podStartE2EDuration="3.3384604s" podCreationTimestamp="2025-12-04 15:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:41:04.327316295 +0000 UTC m=+1271.761986152" watchObservedRunningTime="2025-12-04 15:41:04.3384604 +0000 UTC m=+1271.773130257"
Dec 04 15:41:04 crc kubenswrapper[4676]: I1204 15:41:04.643417 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56cc94d674-46bbd"
Dec 04 15:41:05 crc kubenswrapper[4676]: I1204 15:41:05.319764 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerStarted","Data":"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481"}
Dec 04 15:41:05 crc kubenswrapper[4676]: I1204 15:41:05.324578 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a9c189a-a32b-46fc-99ef-c643d9959aa5","Type":"ContainerStarted","Data":"5ad67d786d69e6e52363e1e04ff10b5d5c0c3dd4d66517ba42efff7397f8d53c"}
Dec 04 15:41:05 crc kubenswrapper[4676]: I1204 15:41:05.324631 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5a9c189a-a32b-46fc-99ef-c643d9959aa5","Type":"ContainerStarted","Data":"e0e5490c528dbc5e1fd337d4735eead18f83c62f001678c5313fe3096a469acb"}
Dec 04 15:41:05 crc kubenswrapper[4676]: I1204 15:41:05.351517 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.351494581 podStartE2EDuration="3.351494581s" podCreationTimestamp="2025-12-04 15:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:41:05.341346345 +0000 UTC m=+1272.776016212" watchObservedRunningTime="2025-12-04 15:41:05.351494581 +0000 UTC m=+1272.786164438"
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347011 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerStarted","Data":"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62"}
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347543 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347249 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-central-agent" containerID="cri-o://7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271" gracePeriod=30
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347249 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="proxy-httpd" containerID="cri-o://1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62" gracePeriod=30
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347280 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-notification-agent" containerID="cri-o://d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1" gracePeriod=30
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.347272 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="sg-core" containerID="cri-o://f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481" gracePeriod=30
Dec 04 15:41:07 crc kubenswrapper[4676]: I1204 15:41:07.378813 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.942694994 podStartE2EDuration="5.378788128s" podCreationTimestamp="2025-12-04 15:41:02 +0000 UTC" firstStartedPulling="2025-12-04 15:41:03.693706273 +0000 UTC m=+1271.128376130" lastFinishedPulling="2025-12-04 15:41:06.129799407 +0000 UTC m=+1273.564469264" observedRunningTime="2025-12-04 15:41:07.371589217 +0000 UTC m=+1274.806259084" watchObservedRunningTime="2025-12-04 15:41:07.378788128 +0000 UTC m=+1274.813457985"
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358116 4676 generic.go:334] "Generic (PLEG): container finished" podID="8267925c-3b72-443e-b352-437e7014120c" containerID="1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62" exitCode=0
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358150 4676 generic.go:334] "Generic (PLEG): container finished" podID="8267925c-3b72-443e-b352-437e7014120c" containerID="f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481" exitCode=2
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358160 4676 generic.go:334] "Generic (PLEG): container finished" podID="8267925c-3b72-443e-b352-437e7014120c" containerID="d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1" exitCode=0
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerDied","Data":"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62"}
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358227 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerDied","Data":"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481"}
Dec 04 15:41:08 crc kubenswrapper[4676]: I1204 15:41:08.358238 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerDied","Data":"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1"}
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.752697 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-68bd568fd5-srw6v"
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.814505 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56cc94d674-46bbd"]
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.814857 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56cc94d674-46bbd" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-api" containerID="cri-o://5017307d15c2cd9ba68144ecc2685519cdeaa90c5dd7a5f2078ac59069785e65" gracePeriod=30
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.815522 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56cc94d674-46bbd" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-httpd" containerID="cri-o://c5f600a18abd0588198fbe3de7b1123c4aa5da8776a404ec4155c4f7ce3c9cd7" gracePeriod=30
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.956477 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:11 crc kubenswrapper[4676]: I1204 15:41:11.956806 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.012378 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.017565 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.406602 4676 generic.go:334] "Generic (PLEG): container finished" podID="853263fd-fa07-43e9-9855-fc057772d052" containerID="c5f600a18abd0588198fbe3de7b1123c4aa5da8776a404ec4155c4f7ce3c9cd7" exitCode=0
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.406681 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerDied","Data":"c5f600a18abd0588198fbe3de7b1123c4aa5da8776a404ec4155c4f7ce3c9cd7"}
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.407047 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:12 crc kubenswrapper[4676]: I1204 15:41:12.407079 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.104138 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.104186 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 
04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.142935 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.160429 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.415874 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 15:41:13 crc kubenswrapper[4676]: I1204 15:41:13.417536 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 04 15:41:14 crc kubenswrapper[4676]: I1204 15:41:14.471738 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:14 crc kubenswrapper[4676]: I1204 15:41:14.472110 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:41:14 crc kubenswrapper[4676]: I1204 15:41:14.483469 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 04 15:41:14 crc kubenswrapper[4676]: I1204 15:41:14.999016 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154187 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154351 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7kmv\" (UniqueName: \"kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154488 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154628 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154722 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.154795 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 
15:41:15.154837 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle\") pod \"8267925c-3b72-443e-b352-437e7014120c\" (UID: \"8267925c-3b72-443e-b352-437e7014120c\") " Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.155489 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.155859 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.156117 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.170029 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts" (OuterVolumeSpecName: "scripts") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.179323 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv" (OuterVolumeSpecName: "kube-api-access-f7kmv") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "kube-api-access-f7kmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.194559 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.258410 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.258681 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7kmv\" (UniqueName: \"kubernetes.io/projected/8267925c-3b72-443e-b352-437e7014120c-kube-api-access-f7kmv\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.258757 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8267925c-3b72-443e-b352-437e7014120c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.258833 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.273488 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.288139 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data" (OuterVolumeSpecName: "config-data") pod "8267925c-3b72-443e-b352-437e7014120c" (UID: "8267925c-3b72-443e-b352-437e7014120c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.360980 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.361191 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8267925c-3b72-443e-b352-437e7014120c-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.443357 4676 generic.go:334] "Generic (PLEG): container finished" podID="8267925c-3b72-443e-b352-437e7014120c" containerID="7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271" exitCode=0 Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.443468 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.443477 4676 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.444487 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.445054 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerDied","Data":"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271"} Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.445087 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8267925c-3b72-443e-b352-437e7014120c","Type":"ContainerDied","Data":"5945325aff7d239d419ac4d972a21e72f4e51d059f112293f3afb5254e1e4b99"} Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.445155 4676 scope.go:117] "RemoveContainer" containerID="1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.615510 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.627500 4676 scope.go:117] "RemoveContainer" containerID="f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.630035 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.650982 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.651497 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="sg-core" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.651542 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="sg-core" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.651599 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-notification-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.651613 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-notification-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.651623 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="proxy-httpd" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.651631 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="proxy-httpd" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.651689 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-central-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.651699 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-central-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.653094 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-central-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.653115 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="sg-core" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.653130 4676 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="proxy-httpd" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.653148 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8267925c-3b72-443e-b352-437e7014120c" containerName="ceilometer-notification-agent" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.655184 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.658700 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.659136 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.684147 4676 scope.go:117] "RemoveContainer" containerID="d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.684310 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.708321 4676 scope.go:117] "RemoveContainer" containerID="7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.737488 4676 scope.go:117] "RemoveContainer" containerID="1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.737987 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62\": container with ID starting with 1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62 not found: ID does not exist" containerID="1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738045 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62"} err="failed to get container status \"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62\": rpc error: code = NotFound desc = could not find container \"1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62\": container with ID starting with 1935e4ffd5a523554926f889c2cbae147da9a97a60ea7d26c9d58d57d57bcf62 not found: ID does not exist" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738090 4676 scope.go:117] "RemoveContainer" containerID="f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.738410 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481\": container with ID starting with f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481 not found: ID does not exist" containerID="f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738440 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481"} err="failed to get container status \"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481\": rpc 
error: code = NotFound desc = could not find container \"f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481\": container with ID starting with f87d66da5d1264504df2c2d4c0fb5a01b7766a5cb3629172a5f1bd6361edf481 not found: ID does not exist" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738507 4676 scope.go:117] "RemoveContainer" containerID="d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.738805 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1\": container with ID starting with d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1 not found: ID does not exist" containerID="d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738841 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1"} err="failed to get container status \"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1\": rpc error: code = NotFound desc = could not find container \"d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1\": container with ID starting with d3f31b75ec9d3c649105e4662210ec050e6105768e46b0e408ce47010ad85fe1 not found: ID does not exist" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.738866 4676 scope.go:117] "RemoveContainer" containerID="7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271" Dec 04 15:41:15 crc kubenswrapper[4676]: E1204 15:41:15.739213 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271\": container with ID starting with 7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271 not found: ID does not exist" containerID="7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.739291 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271"} err="failed to get container status \"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271\": rpc error: code = NotFound desc = could not find container \"7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271\": container with ID starting with 7af2f19cc2f7fe1dee037805a7e325e8a9a7114aa3af5696fecb1b5cdbddf271 not found: ID does not exist" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9fn9\" (UniqueName: \"kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764662 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 
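The RemoveContainer and DeleteContainer entries above show the kubelet asking the runtime to remove containers that are already gone; each "rpc error: code = NotFound" is logged but treated as informational rather than fatal. A sketch of that idempotent-delete pattern using the standard gRPC status codes; the removeFromRuntime call is a stand-in, not the CRI-O client API:

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats NotFound from the runtime as success: a container
// that no longer exists needs no further deletion.
func removeContainer(removeFromRuntime func(id string) error, id string) error {
	err := removeFromRuntime(id)
	if status.Code(err) == codes.NotFound {
		return nil // already removed: nothing to do
	}
	return err
}

func main() {
	gone := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(removeContainer(gone, "1935e4ffd5a5")) // <nil>
}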
15:41:15.764715 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764747 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764774 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764790 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.764821 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.772640 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.866314 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9fn9\" (UniqueName: \"kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.866855 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.866921 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.866957 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.866994 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.867024 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.867059 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.867554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.867722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.872241 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.872856 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.874779 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.895514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9fn9\" (UniqueName: \"kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.896380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") " pod="openstack/ceilometer-0" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.909808 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-29dqv"] Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.911316 4676 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.925376 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-29dqv"] Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.970324 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck72g\" (UniqueName: \"kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g\") pod \"nova-api-db-create-29dqv\" (UID: \"925dde31-bb8c-4306-9dc9-5a7119e33f4e\") " pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:15 crc kubenswrapper[4676]: I1204 15:41:15.978588 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.028144 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.028469 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.037836 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9rfzz"] Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.039651 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.060069 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9rfzz"] Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.076442 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck72g\" (UniqueName: \"kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g\") pod \"nova-api-db-create-29dqv\" (UID: \"925dde31-bb8c-4306-9dc9-5a7119e33f4e\") " pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.105662 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck72g\" (UniqueName: \"kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g\") pod \"nova-api-db-create-29dqv\" (UID: \"925dde31-bb8c-4306-9dc9-5a7119e33f4e\") " pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.110813 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.134924 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.178289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdv9c\" (UniqueName: \"kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c\") pod \"nova-cell0-db-create-9rfzz\" (UID: \"ad2e276b-e6c3-4302-a9d8-b63830394431\") " pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.213217 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-snxtt"] Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.214783 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.238532 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-snxtt"] Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.291597 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9kc\" (UniqueName: \"kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc\") pod \"nova-cell1-db-create-snxtt\" (UID: \"1a6646cc-68b8-4672-be21-58ad781dd616\") " pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.291961 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdv9c\" (UniqueName: \"kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c\") pod \"nova-cell0-db-create-9rfzz\" (UID: \"ad2e276b-e6c3-4302-a9d8-b63830394431\") " pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.319955 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdv9c\" (UniqueName: \"kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c\") pod \"nova-cell0-db-create-9rfzz\" (UID: \"ad2e276b-e6c3-4302-a9d8-b63830394431\") " pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.394592 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9kc\" (UniqueName: \"kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc\") pod \"nova-cell1-db-create-snxtt\" (UID: \"1a6646cc-68b8-4672-be21-58ad781dd616\") " pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.454591 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9kc\" (UniqueName: \"kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc\") pod \"nova-cell1-db-create-snxtt\" (UID: \"1a6646cc-68b8-4672-be21-58ad781dd616\") " pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.456671 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.578645 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:16 crc kubenswrapper[4676]: I1204 15:41:16.857763 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.084567 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-29dqv"] Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.418570 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08" Dec 04 15:41:17 crc kubenswrapper[4676]: E1204 15:41:17.419020 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 40s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(aeea1eb2-6952-4bef-a6f3-7dd8636ff74a)\"" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.486609 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8267925c-3b72-443e-b352-437e7014120c" path="/var/lib/kubelet/pods/8267925c-3b72-443e-b352-437e7014120c/volumes" Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.504846 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9rfzz"] Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.511728 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-29dqv" event={"ID":"925dde31-bb8c-4306-9dc9-5a7119e33f4e","Type":"ContainerStarted","Data":"313cf1a2ffeed12f3f95deed06b08d6d93bd056755df848e83d09eda10280bb6"} Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.524756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerStarted","Data":"bfbbebafbd3572af42e7ab27d90f826f60ac5a659b1bbd3d481448239982f842"} Dec 04 15:41:17 crc kubenswrapper[4676]: I1204 15:41:17.945255 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-snxtt"] Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.539215 4676 generic.go:334] "Generic (PLEG): container finished" podID="925dde31-bb8c-4306-9dc9-5a7119e33f4e" containerID="8288f34a40b8a3e176fd70f0e56ba568112913b863244d7378ed50cf20d7710c" exitCode=0 Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.539561 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-29dqv" event={"ID":"925dde31-bb8c-4306-9dc9-5a7119e33f4e","Type":"ContainerDied","Data":"8288f34a40b8a3e176fd70f0e56ba568112913b863244d7378ed50cf20d7710c"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.545498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-snxtt" event={"ID":"1a6646cc-68b8-4672-be21-58ad781dd616","Type":"ContainerStarted","Data":"5c6a8ef93376487ba0e290a902147d5cf2bff7af2601cb4ae1373029517c4c6c"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.545541 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-snxtt" event={"ID":"1a6646cc-68b8-4672-be21-58ad781dd616","Type":"ContainerStarted","Data":"dcd2a5929be46cb3f339b1cf35a52fcd846a783e307d8833b03cf43d7bbe4268"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.552687 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerStarted","Data":"619024e3a34effa642e1116e56a9ab51b73a34e02409595f29e6205fddccb644"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.552743 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerStarted","Data":"5e73396ad1cc4b0ace8797fe26901e61a00c0ad55b59615922b6a5ecc7498573"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.569971 4676 generic.go:334] "Generic (PLEG): container finished" podID="ad2e276b-e6c3-4302-a9d8-b63830394431" containerID="7c3b3b07cdc851357180f211f0d2ecddd8d41d8a0e7435837731585a4e49732b" exitCode=0 Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.570024 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9rfzz" event={"ID":"ad2e276b-e6c3-4302-a9d8-b63830394431","Type":"ContainerDied","Data":"7c3b3b07cdc851357180f211f0d2ecddd8d41d8a0e7435837731585a4e49732b"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.570057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9rfzz" event={"ID":"ad2e276b-e6c3-4302-a9d8-b63830394431","Type":"ContainerStarted","Data":"239ad409301130267cf733c52abf204685c83ba0d372336b052a0415a9576d73"} Dec 04 15:41:18 crc kubenswrapper[4676]: I1204 15:41:18.596192 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-snxtt" podStartSLOduration=2.596146425 podStartE2EDuration="2.596146425s" podCreationTimestamp="2025-12-04 15:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:41:18.585659499 +0000 UTC m=+1286.020329356" watchObservedRunningTime="2025-12-04 15:41:18.596146425 +0000 UTC m=+1286.030816282" Dec 04 15:41:19 crc kubenswrapper[4676]: I1204 15:41:19.581245 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerStarted","Data":"34d8976aeadb642d2fff1879d582b70ebeafabbea2efb16561257aa1765964ca"} Dec 04 15:41:19 crc kubenswrapper[4676]: I1204 15:41:19.584169 4676 generic.go:334] "Generic (PLEG): container finished" podID="1a6646cc-68b8-4672-be21-58ad781dd616" containerID="5c6a8ef93376487ba0e290a902147d5cf2bff7af2601cb4ae1373029517c4c6c" exitCode=0 Dec 04 15:41:19 crc kubenswrapper[4676]: I1204 15:41:19.584278 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-snxtt" event={"ID":"1a6646cc-68b8-4672-be21-58ad781dd616","Type":"ContainerDied","Data":"5c6a8ef93376487ba0e290a902147d5cf2bff7af2601cb4ae1373029517c4c6c"} Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.269034 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.277648 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.346014 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdv9c\" (UniqueName: \"kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c\") pod \"ad2e276b-e6c3-4302-a9d8-b63830394431\" (UID: \"ad2e276b-e6c3-4302-a9d8-b63830394431\") " Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.346140 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck72g\" (UniqueName: \"kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g\") pod \"925dde31-bb8c-4306-9dc9-5a7119e33f4e\" (UID: \"925dde31-bb8c-4306-9dc9-5a7119e33f4e\") " Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.486461 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g" (OuterVolumeSpecName: "kube-api-access-ck72g") pod "925dde31-bb8c-4306-9dc9-5a7119e33f4e" (UID: "925dde31-bb8c-4306-9dc9-5a7119e33f4e"). InnerVolumeSpecName "kube-api-access-ck72g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.487097 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c" (OuterVolumeSpecName: "kube-api-access-vdv9c") pod "ad2e276b-e6c3-4302-a9d8-b63830394431" (UID: "ad2e276b-e6c3-4302-a9d8-b63830394431"). InnerVolumeSpecName "kube-api-access-vdv9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.552761 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdv9c\" (UniqueName: \"kubernetes.io/projected/ad2e276b-e6c3-4302-a9d8-b63830394431-kube-api-access-vdv9c\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.552797 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck72g\" (UniqueName: \"kubernetes.io/projected/925dde31-bb8c-4306-9dc9-5a7119e33f4e-kube-api-access-ck72g\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.603429 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9rfzz" event={"ID":"ad2e276b-e6c3-4302-a9d8-b63830394431","Type":"ContainerDied","Data":"239ad409301130267cf733c52abf204685c83ba0d372336b052a0415a9576d73"} Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.603481 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239ad409301130267cf733c52abf204685c83ba0d372336b052a0415a9576d73" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.603541 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9rfzz" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.746315 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-29dqv" Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.748019 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-29dqv" event={"ID":"925dde31-bb8c-4306-9dc9-5a7119e33f4e","Type":"ContainerDied","Data":"313cf1a2ffeed12f3f95deed06b08d6d93bd056755df848e83d09eda10280bb6"} Dec 04 15:41:20 crc kubenswrapper[4676]: I1204 15:41:20.748071 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313cf1a2ffeed12f3f95deed06b08d6d93bd056755df848e83d09eda10280bb6" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.341697 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.356874 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm9kc\" (UniqueName: \"kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc\") pod \"1a6646cc-68b8-4672-be21-58ad781dd616\" (UID: \"1a6646cc-68b8-4672-be21-58ad781dd616\") " Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.381172 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc" (OuterVolumeSpecName: "kube-api-access-xm9kc") pod "1a6646cc-68b8-4672-be21-58ad781dd616" (UID: "1a6646cc-68b8-4672-be21-58ad781dd616"). InnerVolumeSpecName "kube-api-access-xm9kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.465206 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm9kc\" (UniqueName: \"kubernetes.io/projected/1a6646cc-68b8-4672-be21-58ad781dd616-kube-api-access-xm9kc\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.756504 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-snxtt" event={"ID":"1a6646cc-68b8-4672-be21-58ad781dd616","Type":"ContainerDied","Data":"dcd2a5929be46cb3f339b1cf35a52fcd846a783e307d8833b03cf43d7bbe4268"} Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.756545 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd2a5929be46cb3f339b1cf35a52fcd846a783e307d8833b03cf43d7bbe4268" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.756519 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-snxtt" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.759284 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerStarted","Data":"acdb243aa13217a33e82fc1675df0c8a3eec896b26e577e06ccc51da0e81faec"} Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.759553 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:41:21 crc kubenswrapper[4676]: I1204 15:41:21.787354 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.55625933 podStartE2EDuration="6.787333307s" podCreationTimestamp="2025-12-04 15:41:15 +0000 UTC" firstStartedPulling="2025-12-04 15:41:16.865100458 +0000 UTC m=+1284.299770315" lastFinishedPulling="2025-12-04 15:41:20.096174435 +0000 UTC m=+1287.530844292" observedRunningTime="2025-12-04 15:41:21.781800706 +0000 UTC m=+1289.216470563" watchObservedRunningTime="2025-12-04 15:41:21.787333307 +0000 UTC m=+1289.222003164" Dec 04 15:41:27 crc kubenswrapper[4676]: I1204 15:41:27.828573 4676 generic.go:334] "Generic (PLEG): container finished" podID="853263fd-fa07-43e9-9855-fc057772d052" containerID="5017307d15c2cd9ba68144ecc2685519cdeaa90c5dd7a5f2078ac59069785e65" exitCode=0 Dec 04 15:41:27 crc kubenswrapper[4676]: I1204 15:41:27.829200 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerDied","Data":"5017307d15c2cd9ba68144ecc2685519cdeaa90c5dd7a5f2078ac59069785e65"} Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.105699 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56cc94d674-46bbd"
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.294527 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs\") pod \"853263fd-fa07-43e9-9855-fc057772d052\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") "
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.295662 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle\") pod \"853263fd-fa07-43e9-9855-fc057772d052\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") "
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.295814 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2w8l\" (UniqueName: \"kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l\") pod \"853263fd-fa07-43e9-9855-fc057772d052\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") "
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.296048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config\") pod \"853263fd-fa07-43e9-9855-fc057772d052\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") "
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.296125 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config\") pod \"853263fd-fa07-43e9-9855-fc057772d052\" (UID: \"853263fd-fa07-43e9-9855-fc057772d052\") "
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.300864 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l" (OuterVolumeSpecName: "kube-api-access-j2w8l") pod "853263fd-fa07-43e9-9855-fc057772d052" (UID: "853263fd-fa07-43e9-9855-fc057772d052"). InnerVolumeSpecName "kube-api-access-j2w8l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.301183 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "853263fd-fa07-43e9-9855-fc057772d052" (UID: "853263fd-fa07-43e9-9855-fc057772d052"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.349037 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config" (OuterVolumeSpecName: "config") pod "853263fd-fa07-43e9-9855-fc057772d052" (UID: "853263fd-fa07-43e9-9855-fc057772d052"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.389339 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "853263fd-fa07-43e9-9855-fc057772d052" (UID: "853263fd-fa07-43e9-9855-fc057772d052"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.393015 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "853263fd-fa07-43e9-9855-fc057772d052" (UID: "853263fd-fa07-43e9-9855-fc057772d052"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.399251 4676 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.399290 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.399305 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2w8l\" (UniqueName: \"kubernetes.io/projected/853263fd-fa07-43e9-9855-fc057772d052-kube-api-access-j2w8l\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.399335 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-config\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.399352 4676 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/853263fd-fa07-43e9-9855-fc057772d052-httpd-config\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.839260 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56cc94d674-46bbd" event={"ID":"853263fd-fa07-43e9-9855-fc057772d052","Type":"ContainerDied","Data":"f33f0e5c98947a33319dc3b7b3de1e8f1dda4691de801ba67da15b581199db76"}
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.839494 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56cc94d674-46bbd"
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.839553 4676 scope.go:117] "RemoveContainer" containerID="c5f600a18abd0588198fbe3de7b1123c4aa5da8776a404ec4155c4f7ce3c9cd7"
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.884792 4676 scope.go:117] "RemoveContainer" containerID="5017307d15c2cd9ba68144ecc2685519cdeaa90c5dd7a5f2078ac59069785e65"
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.897799 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56cc94d674-46bbd"]
Dec 04 15:41:28 crc kubenswrapper[4676]: I1204 15:41:28.904347 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56cc94d674-46bbd"]
Dec 04 15:41:29 crc kubenswrapper[4676]: I1204 15:41:29.398165 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="853263fd-fa07-43e9-9855-fc057772d052" path="/var/lib/kubelet/pods/853263fd-fa07-43e9-9855-fc057772d052/volumes"
Dec 04 15:41:30 crc kubenswrapper[4676]: I1204 15:41:30.385277 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:41:30 crc kubenswrapper[4676]: I1204 15:41:30.872520 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerStarted","Data":"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"}
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.087810 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.095249 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-central-agent" containerID="cri-o://5e73396ad1cc4b0ace8797fe26901e61a00c0ad55b59615922b6a5ecc7498573" gracePeriod=30
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.095417 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="proxy-httpd" containerID="cri-o://acdb243aa13217a33e82fc1675df0c8a3eec896b26e577e06ccc51da0e81faec" gracePeriod=30
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.095459 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="sg-core" containerID="cri-o://34d8976aeadb642d2fff1879d582b70ebeafabbea2efb16561257aa1765964ca" gracePeriod=30
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.095506 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-notification-agent" containerID="cri-o://619024e3a34effa642e1116e56a9ab51b73a34e02409595f29e6205fddccb644" gracePeriod=30
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.206530 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.191:3000/\": read tcp 10.217.0.2:44282->10.217.0.191:3000: read: connection reset by peer"
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.901124 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.901441 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:32 crc kubenswrapper[4676]: I1204 15:41:32.945237 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150521 4676 generic.go:334] "Generic (PLEG): container finished" podID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerID="acdb243aa13217a33e82fc1675df0c8a3eec896b26e577e06ccc51da0e81faec" exitCode=0
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150548 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerDied","Data":"acdb243aa13217a33e82fc1675df0c8a3eec896b26e577e06ccc51da0e81faec"}
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150594 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerDied","Data":"34d8976aeadb642d2fff1879d582b70ebeafabbea2efb16561257aa1765964ca"}
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150558 4676 generic.go:334] "Generic (PLEG): container finished" podID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerID="34d8976aeadb642d2fff1879d582b70ebeafabbea2efb16561257aa1765964ca" exitCode=2
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150620 4676 generic.go:334] "Generic (PLEG): container finished" podID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerID="5e73396ad1cc4b0ace8797fe26901e61a00c0ad55b59615922b6a5ecc7498573" exitCode=0
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.150645 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerDied","Data":"5e73396ad1cc4b0ace8797fe26901e61a00c0ad55b59615922b6a5ecc7498573"}
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.182283 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:33 crc kubenswrapper[4676]: I1204 15:41:33.226959 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 04 15:41:35 crc kubenswrapper[4676]: I1204 15:41:35.168934 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine" containerID="cri-o://1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3" gracePeriod=30
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.156426 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2c40-account-create-4hc7f"]
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.157447 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925dde31-bb8c-4306-9dc9-5a7119e33f4e" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157466 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="925dde31-bb8c-4306-9dc9-5a7119e33f4e" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.157487 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157496 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.157518 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6646cc-68b8-4672-be21-58ad781dd616" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157526 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6646cc-68b8-4672-be21-58ad781dd616" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.157547 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-api"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157553 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-api"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.157570 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2e276b-e6c3-4302-a9d8-b63830394431" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157576 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2e276b-e6c3-4302-a9d8-b63830394431" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157761 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2e276b-e6c3-4302-a9d8-b63830394431" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157784 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6646cc-68b8-4672-be21-58ad781dd616" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157796 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-api"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157813 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="853263fd-fa07-43e9-9855-fc057772d052" containerName="neutron-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.157829 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="925dde31-bb8c-4306-9dc9-5a7119e33f4e" containerName="mariadb-database-create"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.158690 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.160851 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.169230 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2c40-account-create-4hc7f"]
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.182139 4676 generic.go:334] "Generic (PLEG): container finished" podID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerID="619024e3a34effa642e1116e56a9ab51b73a34e02409595f29e6205fddccb644" exitCode=0
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.182188 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerDied","Data":"619024e3a34effa642e1116e56a9ab51b73a34e02409595f29e6205fddccb644"}
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.182222 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8221ffd4-6931-416c-abfe-bf0a4c441b7a","Type":"ContainerDied","Data":"bfbbebafbd3572af42e7ab27d90f826f60ac5a659b1bbd3d481448239982f842"}
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.182235 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfbbebafbd3572af42e7ab27d90f826f60ac5a659b1bbd3d481448239982f842"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.205245 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242100 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242199 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242292 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242312 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242376 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9fn9\" (UniqueName: \"kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242420 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242499 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data\") pod \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\" (UID: \"8221ffd4-6931-416c-abfe-bf0a4c441b7a\") "
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.242842 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psptv\" (UniqueName: \"kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv\") pod \"nova-api-2c40-account-create-4hc7f\" (UID: \"8b49237c-6903-4b6d-b833-4cebfa620ffd\") " pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.243698 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.244791 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.248796 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9" (OuterVolumeSpecName: "kube-api-access-b9fn9") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "kube-api-access-b9fn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.259099 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts" (OuterVolumeSpecName: "scripts") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.276520 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.333837 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-3fd5-account-create-sgtfn"]
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.334367 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="sg-core"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334391 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="sg-core"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.334412 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-central-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334419 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-central-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.334428 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-notification-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334433 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-notification-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: E1204 15:41:36.334461 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="proxy-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334467 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="proxy-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334864 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-central-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334887 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="proxy-httpd"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334916 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="ceilometer-notification-agent"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.334929 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" containerName="sg-core"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.335831 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.344762 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.345852 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psptv\" (UniqueName: \"kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv\") pod \"nova-api-2c40-account-create-4hc7f\" (UID: \"8b49237c-6903-4b6d-b833-4cebfa620ffd\") " pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.345960 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68st\" (UniqueName: \"kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st\") pod \"nova-cell0-3fd5-account-create-sgtfn\" (UID: \"6b817005-97d2-4e1c-9363-15d8d0810d35\") " pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.346172 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.346216 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.346229 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8221ffd4-6931-416c-abfe-bf0a4c441b7a-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.346244 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9fn9\" (UniqueName: \"kubernetes.io/projected/8221ffd4-6931-416c-abfe-bf0a4c441b7a-kube-api-access-b9fn9\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.346257 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.365887 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.371110 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3fd5-account-create-sgtfn"]
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.380831 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psptv\" (UniqueName: \"kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv\") pod \"nova-api-2c40-account-create-4hc7f\" (UID: \"8b49237c-6903-4b6d-b833-4cebfa620ffd\") " pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.382510 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data" (OuterVolumeSpecName: "config-data") pod "8221ffd4-6931-416c-abfe-bf0a4c441b7a" (UID: "8221ffd4-6931-416c-abfe-bf0a4c441b7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.447789 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z68st\" (UniqueName: \"kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st\") pod \"nova-cell0-3fd5-account-create-sgtfn\" (UID: \"6b817005-97d2-4e1c-9363-15d8d0810d35\") " pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.448036 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.448057 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8221ffd4-6931-416c-abfe-bf0a4c441b7a-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.467514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z68st\" (UniqueName: \"kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st\") pod \"nova-cell0-3fd5-account-create-sgtfn\" (UID: \"6b817005-97d2-4e1c-9363-15d8d0810d35\") " pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.516009 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.529180 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7e73-account-create-57czj"]
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.531381 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.533967 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.542642 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e73-account-create-57czj"]
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.551695 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh9xn\" (UniqueName: \"kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn\") pod \"nova-cell1-7e73-account-create-57czj\" (UID: \"4206086a-944c-4c86-8e9c-1b4c9272c70d\") " pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.656363 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh9xn\" (UniqueName: \"kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn\") pod \"nova-cell1-7e73-account-create-57czj\" (UID: \"4206086a-944c-4c86-8e9c-1b4c9272c70d\") " pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.674713 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.675811 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh9xn\" (UniqueName: \"kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn\") pod \"nova-cell1-7e73-account-create-57czj\" (UID: \"4206086a-944c-4c86-8e9c-1b4c9272c70d\") " pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.857965 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:36 crc kubenswrapper[4676]: I1204 15:41:36.980135 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2c40-account-create-4hc7f"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.274135 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-3fd5-account-create-sgtfn"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.287348 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.288799 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2c40-account-create-4hc7f" event={"ID":"8b49237c-6903-4b6d-b833-4cebfa620ffd","Type":"ContainerStarted","Data":"5f3a8489532c54e78dc44d888f481e85976ab5ce093cf568d7178247748b8553"}
Dec 04 15:41:37 crc kubenswrapper[4676]: W1204 15:41:37.317733 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4206086a_944c_4c86_8e9c_1b4c9272c70d.slice/crio-c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5 WatchSource:0}: Error finding container c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5: Status 404 returned error can't find the container with id c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.318021 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7e73-account-create-57czj"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.364265 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.379678 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.422825 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8221ffd4-6931-416c-abfe-bf0a4c441b7a" path="/var/lib/kubelet/pods/8221ffd4-6931-416c-abfe-bf0a4c441b7a/volumes"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.423826 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.430162 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.430268 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.439068 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.439198 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558356 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558437 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558490 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558514 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558710 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hlk\" (UniqueName: \"kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558763 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.558827 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: E1204 15:41:37.629766 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8221ffd4_6931_416c_abfe_bf0a4c441b7a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b49237c_6903_4b6d_b833_4cebfa620ffd.slice/crio-conmon-ef29697bd1b0e438405594ba443129f4c775c39c3b663c74a5dcf8386f71f4a5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8221ffd4_6931_416c_abfe_bf0a4c441b7a.slice/crio-bfbbebafbd3572af42e7ab27d90f826f60ac5a659b1bbd3d481448239982f842\": RecentStats: unable to find data in memory cache]"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.661681 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hlk\" (UniqueName: \"kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.661796 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.661890 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.661947 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.661992 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.662032 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.662063 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.664658 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.664715 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.669658 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.669725 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.670163 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.670888 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.683427 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hlk\" (UniqueName: \"kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk\") pod \"ceilometer-0\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " pod="openstack/ceilometer-0"
Dec 04 15:41:37 crc kubenswrapper[4676]: I1204 15:41:37.754120 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.228348 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:41:38 crc kubenswrapper[4676]: W1204 15:41:38.235016 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4adc98a9_0a54_437f_a041_0a4a1f6deac9.slice/crio-e8a1f23ea92929ddcb7999aa5c90e22eb52f3d549b0054a8f22a6e2a2bc892f9 WatchSource:0}: Error finding container e8a1f23ea92929ddcb7999aa5c90e22eb52f3d549b0054a8f22a6e2a2bc892f9: Status 404 returned error can't find the container with id e8a1f23ea92929ddcb7999aa5c90e22eb52f3d549b0054a8f22a6e2a2bc892f9
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.304599 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerStarted","Data":"e8a1f23ea92929ddcb7999aa5c90e22eb52f3d549b0054a8f22a6e2a2bc892f9"}
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.306435 4676 generic.go:334] "Generic (PLEG): container finished" podID="4206086a-944c-4c86-8e9c-1b4c9272c70d" containerID="19d298dea519a8bf7a1508e45ee4948d1ed84a75b7913e584e1a03a075c3d376" exitCode=0
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.306504 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e73-account-create-57czj" event={"ID":"4206086a-944c-4c86-8e9c-1b4c9272c70d","Type":"ContainerDied","Data":"19d298dea519a8bf7a1508e45ee4948d1ed84a75b7913e584e1a03a075c3d376"}
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.306521 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e73-account-create-57czj" event={"ID":"4206086a-944c-4c86-8e9c-1b4c9272c70d","Type":"ContainerStarted","Data":"c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5"}
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.309426 4676 generic.go:334] "Generic (PLEG): container finished" podID="8b49237c-6903-4b6d-b833-4cebfa620ffd" containerID="ef29697bd1b0e438405594ba443129f4c775c39c3b663c74a5dcf8386f71f4a5" exitCode=0
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.309496 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2c40-account-create-4hc7f" event={"ID":"8b49237c-6903-4b6d-b833-4cebfa620ffd","Type":"ContainerDied","Data":"ef29697bd1b0e438405594ba443129f4c775c39c3b663c74a5dcf8386f71f4a5"}
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.312170 4676 generic.go:334] "Generic (PLEG): container finished" podID="6b817005-97d2-4e1c-9363-15d8d0810d35" containerID="c659b482b9b8fc05646417b89b5d822f64ffc5723b9f82533e46a3abb4b09cde" exitCode=0
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.312230 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3fd5-account-create-sgtfn" event={"ID":"6b817005-97d2-4e1c-9363-15d8d0810d35","Type":"ContainerDied","Data":"c659b482b9b8fc05646417b89b5d822f64ffc5723b9f82533e46a3abb4b09cde"}
Dec 04 15:41:38 crc kubenswrapper[4676]: I1204 15:41:38.312263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3fd5-account-create-sgtfn" event={"ID":"6b817005-97d2-4e1c-9363-15d8d0810d35","Type":"ContainerStarted","Data":"b5341fd6a6f6d97e00ba7a45e449505b6204484d1703f4b33a3f80aa73eac94b"}
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.315055 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.336847 4676 generic.go:334] "Generic (PLEG): container finished" podID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerID="1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3" exitCode=0
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.337038 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.337879 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"}
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.338114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a","Type":"ContainerDied","Data":"904535432fc47f7b1b1d3c4610189de6b90c1e1c083943416a48bcc79a1e46d1"}
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.338143 4676 scope.go:117] "RemoveContainer" containerID="1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.348741 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerStarted","Data":"2e979b0c5efea0d106eeab479a29cdf27cb8207e647fe27b6904c5ca051cd793"}
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.348836 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerStarted","Data":"073b35788d7b55e6a297304e7a91f4550c1b34b3d45da7c1786c64c403eae7ef"}
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.405795 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.458177 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle\") pod \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.458258 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca\") pod \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.459112 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfptw\" (UniqueName: \"kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw\") pod \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.459171 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data\") pod \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.459191 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs\") pod \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\" (UID: \"aeea1eb2-6952-4bef-a6f3-7dd8636ff74a\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.464702 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs" (OuterVolumeSpecName: "logs") pod "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" (UID: "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.467888 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw" (OuterVolumeSpecName: "kube-api-access-bfptw") pod "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" (UID: "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"). InnerVolumeSpecName "kube-api-access-bfptw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.503843 4676 scope.go:117] "RemoveContainer" containerID="1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.514995 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3\": container with ID starting with 1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3 not found: ID does not exist" containerID="1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.515009 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" (UID: "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.515053 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3"} err="failed to get container status \"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3\": rpc error: code = NotFound desc = could not find container \"1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3\": container with ID starting with 1c872684ce347a5b668bbde386f73f83c110f519e941e77e89904128422e34c3 not found: ID does not exist"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.515092 4676 scope.go:117] "RemoveContainer" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.517093 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08\": container with ID starting with 06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08 not found: ID does not exist" containerID="06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.517144 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08"} err="failed to get container status \"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08\": rpc error: code = NotFound desc = could not find container \"06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08\": container with ID starting with 06f5fced8f594d1a54e402cf3d33889e3095b4114404e80e0cab5b7a81d4ee08 not found: ID does not exist"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.559728 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data" (OuterVolumeSpecName: "config-data") pod "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" (UID: "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.562384 4676 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.562520 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfptw\" (UniqueName: \"kubernetes.io/projected/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-kube-api-access-bfptw\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.563320 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.563420 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-logs\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.579077 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" (UID: "aeea1eb2-6952-4bef-a6f3-7dd8636ff74a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.667050 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.684988 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.716983 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.733454 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.734409 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734436 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.734452 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734462 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.734480 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734486 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.734498 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734504 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: E1204 15:41:39.734513 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734520 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734754 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734768 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.734780 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.735716 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.739233 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.744704 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-3fd5-account-create-sgtfn"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.749636 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.888191 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z68st\" (UniqueName: \"kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st\") pod \"6b817005-97d2-4e1c-9363-15d8d0810d35\" (UID: \"6b817005-97d2-4e1c-9363-15d8d0810d35\") "
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.889439 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxkwp\" (UniqueName: \"kubernetes.io/projected/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-kube-api-access-nxkwp\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.889484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.889511 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.889537 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-logs\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.889695 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.894471 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st" (OuterVolumeSpecName: "kube-api-access-z68st") pod "6b817005-97d2-4e1c-9363-15d8d0810d35" (UID: "6b817005-97d2-4e1c-9363-15d8d0810d35"). InnerVolumeSpecName "kube-api-access-z68st". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991463 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkwp\" (UniqueName: \"kubernetes.io/projected/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-kube-api-access-nxkwp\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991519 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991542 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991566 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-logs\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.991800 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z68st\" (UniqueName: \"kubernetes.io/projected/6b817005-97d2-4e1c-9363-15d8d0810d35-kube-api-access-z68st\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.992290 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-logs\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.995243 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.996468 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-config-data\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:39 crc kubenswrapper[4676]: I1204 15:41:39.996676 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.011504 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkwp\" (UniqueName: \"kubernetes.io/projected/d97e77cc-3e3e-4d05-b57e-b87782f3ada8-kube-api-access-nxkwp\") pod \"watcher-decision-engine-0\" (UID: \"d97e77cc-3e3e-4d05-b57e-b87782f3ada8\") " pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.089832 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.101784 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7e73-account-create-57czj"
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.109266 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2c40-account-create-4hc7f"
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.199758 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh9xn\" (UniqueName: \"kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn\") pod \"4206086a-944c-4c86-8e9c-1b4c9272c70d\" (UID: \"4206086a-944c-4c86-8e9c-1b4c9272c70d\") "
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.212573 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn" (OuterVolumeSpecName: "kube-api-access-mh9xn") pod "4206086a-944c-4c86-8e9c-1b4c9272c70d" (UID: "4206086a-944c-4c86-8e9c-1b4c9272c70d"). InnerVolumeSpecName "kube-api-access-mh9xn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.305169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psptv\" (UniqueName: \"kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv\") pod \"8b49237c-6903-4b6d-b833-4cebfa620ffd\" (UID: \"8b49237c-6903-4b6d-b833-4cebfa620ffd\") "
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.305810 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh9xn\" (UniqueName: \"kubernetes.io/projected/4206086a-944c-4c86-8e9c-1b4c9272c70d-kube-api-access-mh9xn\") on node \"crc\" DevicePath \"\""
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.312437 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv" (OuterVolumeSpecName: "kube-api-access-psptv") pod "8b49237c-6903-4b6d-b833-4cebfa620ffd" (UID: "8b49237c-6903-4b6d-b833-4cebfa620ffd"). InnerVolumeSpecName "kube-api-access-psptv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.396870 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerStarted","Data":"f2dac21b1543ca0262f785ff09ffe65b3f518d65e1c3b1142c67f9d294414482"}
Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.403240 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7e73-account-create-57czj" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.403243 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7e73-account-create-57czj" event={"ID":"4206086a-944c-4c86-8e9c-1b4c9272c70d","Type":"ContainerDied","Data":"c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5"} Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.403304 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c504cd123f72749714c18d691b2d53f8858a7fc160a0e39496b727181b4af5c5" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.414340 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2c40-account-create-4hc7f" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.414522 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2c40-account-create-4hc7f" event={"ID":"8b49237c-6903-4b6d-b833-4cebfa620ffd","Type":"ContainerDied","Data":"5f3a8489532c54e78dc44d888f481e85976ab5ce093cf568d7178247748b8553"} Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.414768 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f3a8489532c54e78dc44d888f481e85976ab5ce093cf568d7178247748b8553" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.416725 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psptv\" (UniqueName: \"kubernetes.io/projected/8b49237c-6903-4b6d-b833-4cebfa620ffd-kube-api-access-psptv\") on node \"crc\" DevicePath \"\"" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.421107 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-3fd5-account-create-sgtfn" event={"ID":"6b817005-97d2-4e1c-9363-15d8d0810d35","Type":"ContainerDied","Data":"b5341fd6a6f6d97e00ba7a45e449505b6204484d1703f4b33a3f80aa73eac94b"} Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.421149 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5341fd6a6f6d97e00ba7a45e449505b6204484d1703f4b33a3f80aa73eac94b" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.421559 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-3fd5-account-create-sgtfn" Dec 04 15:41:40 crc kubenswrapper[4676]: I1204 15:41:40.651294 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.400670 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" path="/var/lib/kubelet/pods/aeea1eb2-6952-4bef-a6f3-7dd8636ff74a/volumes" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.438879 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"d97e77cc-3e3e-4d05-b57e-b87782f3ada8","Type":"ContainerStarted","Data":"87393fef7a9b600c73e44927605ef6f94ce4f28c13f0eb23ec124c8b0da765f3"} Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.438957 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"d97e77cc-3e3e-4d05-b57e-b87782f3ada8","Type":"ContainerStarted","Data":"2174ddd541361fd8bfb4fb2b0570c05a6b8df97f86a22ed61a49b26bfd35b323"} Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.470356 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.470333556 podStartE2EDuration="2.470333556s" podCreationTimestamp="2025-12-04 15:41:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:41:41.461021336 +0000 UTC m=+1308.895691193" watchObservedRunningTime="2025-12-04 15:41:41.470333556 +0000 UTC m=+1308.905003413" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.671540 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lhqd6"] Dec 04 15:41:41 crc kubenswrapper[4676]: E1204 15:41:41.671998 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4206086a-944c-4c86-8e9c-1b4c9272c70d" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672015 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4206086a-944c-4c86-8e9c-1b4c9272c70d" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: E1204 15:41:41.672039 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b817005-97d2-4e1c-9363-15d8d0810d35" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672046 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b817005-97d2-4e1c-9363-15d8d0810d35" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: E1204 15:41:41.672064 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b49237c-6903-4b6d-b833-4cebfa620ffd" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672072 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b49237c-6903-4b6d-b833-4cebfa620ffd" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672286 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b817005-97d2-4e1c-9363-15d8d0810d35" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672311 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b49237c-6903-4b6d-b833-4cebfa620ffd" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 
15:41:41.672321 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4206086a-944c-4c86-8e9c-1b4c9272c70d" containerName="mariadb-account-create" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.672334 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.678615 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.681613 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lhqd6"] Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.683268 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.683439 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xkwhn" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.683703 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.836988 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98rhn\" (UniqueName: \"kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.837378 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.837473 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.837509 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.939558 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98rhn\" (UniqueName: \"kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.939646 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.939748 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.939782 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.950575 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.959646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.965653 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98rhn\" (UniqueName: \"kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:41 crc kubenswrapper[4676]: I1204 15:41:41.966566 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lhqd6\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:42 crc kubenswrapper[4676]: I1204 15:41:42.095445 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:41:42 crc kubenswrapper[4676]: I1204 15:41:42.453165 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerStarted","Data":"071146b26b81472b41f05d348de7e09f7d76f620ca9b622c2cbf0ea683cb57df"} Dec 04 15:41:42 crc kubenswrapper[4676]: I1204 15:41:42.453424 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:41:42 crc kubenswrapper[4676]: I1204 15:41:42.480127 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.176931716 podStartE2EDuration="5.480106846s" podCreationTimestamp="2025-12-04 15:41:37 +0000 UTC" firstStartedPulling="2025-12-04 15:41:38.237919038 +0000 UTC m=+1305.672588885" lastFinishedPulling="2025-12-04 15:41:41.541094158 +0000 UTC m=+1308.975764015" observedRunningTime="2025-12-04 15:41:42.472581558 +0000 UTC m=+1309.907251415" watchObservedRunningTime="2025-12-04 15:41:42.480106846 +0000 UTC m=+1309.914776703" Dec 04 15:41:42 crc kubenswrapper[4676]: I1204 15:41:42.619830 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lhqd6"] Dec 04 15:41:43 crc kubenswrapper[4676]: I1204 15:41:43.491945 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" event={"ID":"9dea9144-3173-4ad8-ab2a-d44cd0215507","Type":"ContainerStarted","Data":"3145063ecbeb0318964b484420ef1bf714954a109bf1078ed4c11fbb491869ac"} Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.026745 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.027142 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.027205 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.028134 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.028203 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2" gracePeriod=600 Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.547324 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2" exitCode=0 Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.547361 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2"} Dec 04 15:41:46 crc kubenswrapper[4676]: I1204 15:41:46.547633 4676 scope.go:117] "RemoveContainer" containerID="47374e6ac332c7bd6c641b2efeca6385b181e71dff18cb42d3770eabc6e1122b" Dec 04 15:41:50 crc kubenswrapper[4676]: I1204 15:41:50.090878 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Dec 04 15:41:50 crc kubenswrapper[4676]: I1204 15:41:50.122591 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Dec 04 15:41:50 crc kubenswrapper[4676]: I1204 15:41:50.587616 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Dec 04 15:41:50 crc kubenswrapper[4676]: I1204 15:41:50.625993 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Dec 04 15:41:51 crc kubenswrapper[4676]: I1204 15:41:51.605755 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" event={"ID":"9dea9144-3173-4ad8-ab2a-d44cd0215507","Type":"ContainerStarted","Data":"349c81e6cadc8dbee6c218cb89424578435b0f2e4e587da0b9a04a2e0fc8eeb3"} Dec 04 15:41:51 crc kubenswrapper[4676]: I1204 15:41:51.618347 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994"} Dec 04 15:41:51 crc kubenswrapper[4676]: I1204 15:41:51.682963 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" podStartSLOduration=2.188194439 podStartE2EDuration="10.682939296s" podCreationTimestamp="2025-12-04 15:41:41 +0000 UTC" firstStartedPulling="2025-12-04 15:41:42.617000975 +0000 UTC m=+1310.051670832" lastFinishedPulling="2025-12-04 15:41:51.111745832 +0000 UTC m=+1318.546415689" observedRunningTime="2025-12-04 15:41:51.633238895 +0000 UTC m=+1319.067908762" watchObservedRunningTime="2025-12-04 15:41:51.682939296 +0000 UTC m=+1319.117609143" Dec 04 15:42:04 crc kubenswrapper[4676]: I1204 15:42:04.751055 4676 generic.go:334] "Generic (PLEG): container finished" podID="9dea9144-3173-4ad8-ab2a-d44cd0215507" containerID="349c81e6cadc8dbee6c218cb89424578435b0f2e4e587da0b9a04a2e0fc8eeb3" exitCode=0 Dec 04 15:42:04 crc kubenswrapper[4676]: I1204 15:42:04.751146 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" event={"ID":"9dea9144-3173-4ad8-ab2a-d44cd0215507","Type":"ContainerDied","Data":"349c81e6cadc8dbee6c218cb89424578435b0f2e4e587da0b9a04a2e0fc8eeb3"} Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.098573 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.229103 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data\") pod \"9dea9144-3173-4ad8-ab2a-d44cd0215507\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.229159 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts\") pod \"9dea9144-3173-4ad8-ab2a-d44cd0215507\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.229267 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle\") pod \"9dea9144-3173-4ad8-ab2a-d44cd0215507\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.229321 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98rhn\" (UniqueName: \"kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn\") pod \"9dea9144-3173-4ad8-ab2a-d44cd0215507\" (UID: \"9dea9144-3173-4ad8-ab2a-d44cd0215507\") " Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.235090 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn" (OuterVolumeSpecName: "kube-api-access-98rhn") pod "9dea9144-3173-4ad8-ab2a-d44cd0215507" (UID: "9dea9144-3173-4ad8-ab2a-d44cd0215507"). InnerVolumeSpecName "kube-api-access-98rhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.235211 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts" (OuterVolumeSpecName: "scripts") pod "9dea9144-3173-4ad8-ab2a-d44cd0215507" (UID: "9dea9144-3173-4ad8-ab2a-d44cd0215507"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.258366 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data" (OuterVolumeSpecName: "config-data") pod "9dea9144-3173-4ad8-ab2a-d44cd0215507" (UID: "9dea9144-3173-4ad8-ab2a-d44cd0215507"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.260382 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dea9144-3173-4ad8-ab2a-d44cd0215507" (UID: "9dea9144-3173-4ad8-ab2a-d44cd0215507"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.332188 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.332409 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98rhn\" (UniqueName: \"kubernetes.io/projected/9dea9144-3173-4ad8-ab2a-d44cd0215507-kube-api-access-98rhn\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.332504 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.332563 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9dea9144-3173-4ad8-ab2a-d44cd0215507-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.777185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" event={"ID":"9dea9144-3173-4ad8-ab2a-d44cd0215507","Type":"ContainerDied","Data":"3145063ecbeb0318964b484420ef1bf714954a109bf1078ed4c11fbb491869ac"} Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.777239 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3145063ecbeb0318964b484420ef1bf714954a109bf1078ed4c11fbb491869ac" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.777321 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lhqd6" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.875217 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 15:42:06 crc kubenswrapper[4676]: E1204 15:42:06.875871 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dea9144-3173-4ad8-ab2a-d44cd0215507" containerName="nova-cell0-conductor-db-sync" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.875897 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dea9144-3173-4ad8-ab2a-d44cd0215507" containerName="nova-cell0-conductor-db-sync" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.876142 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dea9144-3173-4ad8-ab2a-d44cd0215507" containerName="nova-cell0-conductor-db-sync" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.876163 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea1eb2-6952-4bef-a6f3-7dd8636ff74a" containerName="watcher-decision-engine" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.876892 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.882633 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xkwhn" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.882686 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 04 15:42:06 crc kubenswrapper[4676]: I1204 15:42:06.885387 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.046321 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj4nh\" (UniqueName: \"kubernetes.io/projected/341ba99e-36fa-4121-978a-de87bfd92b85-kube-api-access-sj4nh\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.046674 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.047072 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.148983 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj4nh\" (UniqueName: \"kubernetes.io/projected/341ba99e-36fa-4121-978a-de87bfd92b85-kube-api-access-sj4nh\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.149084 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.149218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.155520 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.156878 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/341ba99e-36fa-4121-978a-de87bfd92b85-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.171241 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj4nh\" (UniqueName: \"kubernetes.io/projected/341ba99e-36fa-4121-978a-de87bfd92b85-kube-api-access-sj4nh\") pod \"nova-cell0-conductor-0\" (UID: \"341ba99e-36fa-4121-978a-de87bfd92b85\") " pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.201874 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.630631 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.761973 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 15:42:07 crc kubenswrapper[4676]: I1204 15:42:07.788080 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"341ba99e-36fa-4121-978a-de87bfd92b85","Type":"ContainerStarted","Data":"9a20498e6e7f3efdf5a11d144ab901c4e9d749c3798108996cc347d2d7497489"} Dec 04 15:42:08 crc kubenswrapper[4676]: I1204 15:42:08.799985 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"341ba99e-36fa-4121-978a-de87bfd92b85","Type":"ContainerStarted","Data":"ed2089361650544f8887dfc4bc4f3f299093b7da9fff5fa458a96342b7178d41"} Dec 04 15:42:08 crc kubenswrapper[4676]: I1204 15:42:08.800492 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.491884 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=5.491865072 podStartE2EDuration="5.491865072s" podCreationTimestamp="2025-12-04 15:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:08.818108883 +0000 UTC m=+1336.252778740" watchObservedRunningTime="2025-12-04 15:42:11.491865072 +0000 UTC m=+1338.926534919" Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.494004 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.494234 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" containerName="kube-state-metrics" containerID="cri-o://1d295816b0c149b0ed9563d54bf078c7551d06e7304c2510bbb73404b846decd" gracePeriod=30 Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.839689 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" containerID="1d295816b0c149b0ed9563d54bf078c7551d06e7304c2510bbb73404b846decd" exitCode=2 Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.840024 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ea978af1-b6d8-490b-8bfd-6b2ec699f47f","Type":"ContainerDied","Data":"1d295816b0c149b0ed9563d54bf078c7551d06e7304c2510bbb73404b846decd"} Dec 04 15:42:11 crc kubenswrapper[4676]: I1204 15:42:11.971156 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.142003 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjkf\" (UniqueName: \"kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf\") pod \"ea978af1-b6d8-490b-8bfd-6b2ec699f47f\" (UID: \"ea978af1-b6d8-490b-8bfd-6b2ec699f47f\") " Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.147743 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf" (OuterVolumeSpecName: "kube-api-access-wrjkf") pod "ea978af1-b6d8-490b-8bfd-6b2ec699f47f" (UID: "ea978af1-b6d8-490b-8bfd-6b2ec699f47f"). InnerVolumeSpecName "kube-api-access-wrjkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.235030 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.244266 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrjkf\" (UniqueName: \"kubernetes.io/projected/ea978af1-b6d8-490b-8bfd-6b2ec699f47f-kube-api-access-wrjkf\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.741076 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mvkng"] Dec 04 15:42:12 crc kubenswrapper[4676]: E1204 15:42:12.741527 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" containerName="kube-state-metrics" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.741541 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" containerName="kube-state-metrics" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.741741 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" containerName="kube-state-metrics" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.742386 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.744233 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.744526 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.757869 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvkng"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.853444 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"ea978af1-b6d8-490b-8bfd-6b2ec699f47f","Type":"ContainerDied","Data":"2042bee0f9f93f4842370f0136ee9d3ff847165421fdf33cfc079a1e57d94e44"} Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.853504 4676 scope.go:117] "RemoveContainer" containerID="1d295816b0c149b0ed9563d54bf078c7551d06e7304c2510bbb73404b846decd" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.853671 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.855717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.855808 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgrzk\" (UniqueName: \"kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.855857 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.855933 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.892335 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.894418 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.898296 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.906573 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.929982 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.952339 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.958184 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.958300 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgrzk\" (UniqueName: \"kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.958355 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.958418 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.970079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.970167 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.971975 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.975587 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.978400 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 15:42:12 crc kubenswrapper[4676]: I1204 15:42:12.986180 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.002646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgrzk\" (UniqueName: \"kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk\") pod \"nova-cell0-cell-mapping-mvkng\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.057985 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060487 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vft4l\" (UniqueName: \"kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060555 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060591 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060607 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjq7\" (UniqueName: \"kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060645 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060674 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.060699 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.061322 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.074204 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.075806 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.083308 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.083574 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.097277 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.122490 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.124583 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.128019 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.154295 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.155779 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.157997 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165601 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vft4l\" (UniqueName: \"kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165667 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165748 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165764 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjq7\" (UniqueName: \"kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165798 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165821 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165847 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165869 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " 
pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165946 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdgw4\" (UniqueName: \"kubernetes.io/projected/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-api-access-kdgw4\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.165981 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.174887 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.174968 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.184136 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.188267 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.192957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.195713 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.199090 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data\") pod \"nova-scheduler-0\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.212808 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjq7\" (UniqueName: \"kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7\") pod \"nova-api-0\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.221730 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vft4l\" (UniqueName: \"kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l\") pod \"nova-scheduler-0\" (UID: 
\"09e1cddd-f35d-4e93-9331-429675aa4275\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.267026 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqtv8\" (UniqueName: \"kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.271723 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.271822 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.271977 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.272730 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.272794 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.272888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4nhv\" (UniqueName: \"kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.272950 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.273079 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdgw4\" (UniqueName: \"kubernetes.io/projected/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-api-access-kdgw4\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " 
pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.273111 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.273147 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.279146 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.279824 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.280051 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.280918 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.290485 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.294522 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdgw4\" (UniqueName: \"kubernetes.io/projected/cb6a2b06-d8cb-4925-97c6-90172194a399-kube-api-access-kdgw4\") pod \"kube-state-metrics-0\" (UID: \"cb6a2b06-d8cb-4925-97c6-90172194a399\") " pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.335599 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.374860 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375039 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375069 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375101 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375151 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4nhv\" (UniqueName: \"kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375215 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375246 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kkg\" (UniqueName: \"kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375314 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375375 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375410 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375487 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqtv8\" (UniqueName: \"kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.375587 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.376647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.381035 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.389848 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.395612 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqtv8\" (UniqueName: \"kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.397667 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.398085 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.398890 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.407671 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4nhv\" (UniqueName: \"kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv\") pod \"nova-cell1-novncproxy-0\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.458040 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea978af1-b6d8-490b-8bfd-6b2ec699f47f" path="/var/lib/kubelet/pods/ea978af1-b6d8-490b-8bfd-6b2ec699f47f/volumes" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.477887 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.478101 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.478128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0\") pod 
\"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.478176 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.478244 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62kkg\" (UniqueName: \"kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.478316 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.479038 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.479110 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.479716 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.485606 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.490098 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.497689 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kkg\" (UniqueName: \"kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg\") pod \"dnsmasq-dns-568844f8ff-tk8hd\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") " 
pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.512069 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.574043 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.591257 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.625073 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.636537 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.732292 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvkng"] Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.872926 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvkng" event={"ID":"654b6ea4-eb07-4074-a7ba-d743b87f6489","Type":"ContainerStarted","Data":"0254b9fac023d44ab4f7dd2b2755f5340c9b972d76fc2ca4909885bc67d3f750"} Dec 04 15:42:13 crc kubenswrapper[4676]: I1204 15:42:13.978064 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.022622 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8b49f"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.024259 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.028277 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.028574 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.050149 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8b49f"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.090721 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.090784 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg85x\" (UniqueName: \"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.090852 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.090913 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.197588 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.197652 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg85x\" (UniqueName: \"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.197714 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.197761 4676 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.216111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.217255 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.221587 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg85x\" (UniqueName: \"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.235532 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-8b49f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.236864 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.301928 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.425638 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.450535 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.480651 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.670854 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.671393 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-central-agent" containerID="cri-o://073b35788d7b55e6a297304e7a91f4550c1b34b3d45da7c1786c64c403eae7ef" gracePeriod=30 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.671665 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="proxy-httpd" containerID="cri-o://071146b26b81472b41f05d348de7e09f7d76f620ca9b622c2cbf0ea683cb57df" gracePeriod=30 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.671690 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="sg-core" containerID="cri-o://f2dac21b1543ca0262f785ff09ffe65b3f518d65e1c3b1142c67f9d294414482" gracePeriod=30 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.671667 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-notification-agent" containerID="cri-o://2e979b0c5efea0d106eeab479a29cdf27cb8207e647fe27b6904c5ca051cd793" gracePeriod=30 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.698839 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.896616 4676 generic.go:334] "Generic (PLEG): container finished" podID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerID="99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420" exitCode=0 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.896698 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" event={"ID":"5a9e7336-af8d-48d4-82a4-3631cb57ecc8","Type":"ContainerDied","Data":"99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.896748 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" event={"ID":"5a9e7336-af8d-48d4-82a4-3631cb57ecc8","Type":"ContainerStarted","Data":"9407256b2495c0defca965c5dcfec1f7df79882b1ba59115fe7c7f3a1bebf82a"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.923079 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc5ec209-3e74-4d87-ba5f-d84052dd2c32","Type":"ContainerStarted","Data":"3749cf14963d6dad41d9c9242141823e535bbb9c28cbc4fc14b41179c883d22a"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.925781 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerStarted","Data":"535946f01a55da8dfff70b594522dd891e2b4fb8896e61ef610ca637ccf7b12c"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.933487 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"09e1cddd-f35d-4e93-9331-429675aa4275","Type":"ContainerStarted","Data":"ce7f29dc159161bc9f4e8ebb83da8c730c9d8288e5f2a02f2bca2ad7f095ec53"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.941045 4676 generic.go:334] "Generic (PLEG): container finished" podID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerID="071146b26b81472b41f05d348de7e09f7d76f620ca9b622c2cbf0ea683cb57df" exitCode=0 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.941084 4676 generic.go:334] "Generic (PLEG): container finished" podID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerID="f2dac21b1543ca0262f785ff09ffe65b3f518d65e1c3b1142c67f9d294414482" exitCode=2 Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.941134 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerDied","Data":"071146b26b81472b41f05d348de7e09f7d76f620ca9b622c2cbf0ea683cb57df"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.941166 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerDied","Data":"f2dac21b1543ca0262f785ff09ffe65b3f518d65e1c3b1142c67f9d294414482"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.947855 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb6a2b06-d8cb-4925-97c6-90172194a399","Type":"ContainerStarted","Data":"4fd1919e04f8b690659e9fc987edc52575388007f5bb8dffb13462d8ef330131"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.949482 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerStarted","Data":"fffb40efd5bf9ac78787c4aac2c8eb0a7d65df0180d672c98e6e403ad2934205"} Dec 04 15:42:14 crc kubenswrapper[4676]: I1204 15:42:14.957447 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvkng" event={"ID":"654b6ea4-eb07-4074-a7ba-d743b87f6489","Type":"ContainerStarted","Data":"ace2ef9a22a2171efb1772e651ff254eca28043d331a6a0802a7a96c7ef94df2"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.015258 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mvkng" podStartSLOduration=3.015237107 podStartE2EDuration="3.015237107s" podCreationTimestamp="2025-12-04 15:42:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:14.987836362 +0000 UTC m=+1342.422506219" watchObservedRunningTime="2025-12-04 15:42:15.015237107 +0000 UTC m=+1342.449906954" Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.015898 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8b49f"] Dec 04 15:42:15 crc kubenswrapper[4676]: W1204 15:42:15.093304 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod282e9515_3aa8_49a9_a752_253d7cdf6b9f.slice/crio-55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd WatchSource:0}: Error finding container 55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd: Status 404 returned error can't find the container with id 55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 
15:42:15.976666 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" event={"ID":"5a9e7336-af8d-48d4-82a4-3631cb57ecc8","Type":"ContainerStarted","Data":"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.977112 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.984201 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8b49f" event={"ID":"282e9515-3aa8-49a9-a752-253d7cdf6b9f","Type":"ContainerStarted","Data":"40a9ab17647fbd3e6a32f5508d34aeafa1667d10a701981925d5b888ca267998"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.984252 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8b49f" event={"ID":"282e9515-3aa8-49a9-a752-253d7cdf6b9f","Type":"ContainerStarted","Data":"55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.988427 4676 generic.go:334] "Generic (PLEG): container finished" podID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerID="073b35788d7b55e6a297304e7a91f4550c1b34b3d45da7c1786c64c403eae7ef" exitCode=0 Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.988504 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerDied","Data":"073b35788d7b55e6a297304e7a91f4550c1b34b3d45da7c1786c64c403eae7ef"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.990899 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cb6a2b06-d8cb-4925-97c6-90172194a399","Type":"ContainerStarted","Data":"ad4a271ea0f5f85391b15ab02b06c7cc2d50819676512d06f340311796072c2e"} Dec 04 15:42:15 crc kubenswrapper[4676]: I1204 15:42:15.991160 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 04 15:42:16 crc kubenswrapper[4676]: I1204 15:42:16.038895 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" podStartSLOduration=3.038863618 podStartE2EDuration="3.038863618s" podCreationTimestamp="2025-12-04 15:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:16.002894035 +0000 UTC m=+1343.437563902" watchObservedRunningTime="2025-12-04 15:42:16.038863618 +0000 UTC m=+1343.473533475" Dec 04 15:42:16 crc kubenswrapper[4676]: I1204 15:42:16.049633 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-8b49f" podStartSLOduration=3.04960454 podStartE2EDuration="3.04960454s" podCreationTimestamp="2025-12-04 15:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:16.041012551 +0000 UTC m=+1343.475682418" watchObservedRunningTime="2025-12-04 15:42:16.04960454 +0000 UTC m=+1343.484274397" Dec 04 15:42:16 crc kubenswrapper[4676]: I1204 15:42:16.078482 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.586672976 podStartE2EDuration="4.078459076s" podCreationTimestamp="2025-12-04 15:42:12 +0000 UTC" 
firstStartedPulling="2025-12-04 15:42:14.239292437 +0000 UTC m=+1341.673962294" lastFinishedPulling="2025-12-04 15:42:14.731078537 +0000 UTC m=+1342.165748394" observedRunningTime="2025-12-04 15:42:16.074171272 +0000 UTC m=+1343.508841129" watchObservedRunningTime="2025-12-04 15:42:16.078459076 +0000 UTC m=+1343.513128933" Dec 04 15:42:16 crc kubenswrapper[4676]: I1204 15:42:16.528810 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:16 crc kubenswrapper[4676]: I1204 15:42:16.551620 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:17 crc kubenswrapper[4676]: I1204 15:42:17.004033 4676 generic.go:334] "Generic (PLEG): container finished" podID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerID="2e979b0c5efea0d106eeab479a29cdf27cb8207e647fe27b6904c5ca051cd793" exitCode=0 Dec 04 15:42:17 crc kubenswrapper[4676]: I1204 15:42:17.004139 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerDied","Data":"2e979b0c5efea0d106eeab479a29cdf27cb8207e647fe27b6904c5ca051cd793"} Dec 04 15:42:17 crc kubenswrapper[4676]: I1204 15:42:17.820622 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000548 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000622 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000719 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000816 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hlk\" (UniqueName: \"kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000862 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.000966 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.001003 4676 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml\") pod \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\" (UID: \"4adc98a9-0a54-437f-a041-0a4a1f6deac9\") " Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.002586 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.002805 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.007350 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk" (OuterVolumeSpecName: "kube-api-access-l6hlk") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "kube-api-access-l6hlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.010791 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts" (OuterVolumeSpecName: "scripts") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.022254 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3" gracePeriod=30 Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.022423 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc5ec209-3e74-4d87-ba5f-d84052dd2c32","Type":"ContainerStarted","Data":"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3"} Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.024145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerStarted","Data":"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd"} Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.026785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09e1cddd-f35d-4e93-9331-429675aa4275","Type":"ContainerStarted","Data":"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a"} Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.087757 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4adc98a9-0a54-437f-a041-0a4a1f6deac9","Type":"ContainerDied","Data":"e8a1f23ea92929ddcb7999aa5c90e22eb52f3d549b0054a8f22a6e2a2bc892f9"} Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.087818 4676 scope.go:117] "RemoveContainer" containerID="071146b26b81472b41f05d348de7e09f7d76f620ca9b622c2cbf0ea683cb57df" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.087985 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.093825 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerStarted","Data":"13232c1c155f68f30e6bedc92ff8d1b6ddf0179df6e23bc23c4f0fca644e0ab3"} Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.098466 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.368685866 podStartE2EDuration="5.098445998s" podCreationTimestamp="2025-12-04 15:42:13 +0000 UTC" firstStartedPulling="2025-12-04 15:42:14.749037398 +0000 UTC m=+1342.183707255" lastFinishedPulling="2025-12-04 15:42:17.47879752 +0000 UTC m=+1344.913467387" observedRunningTime="2025-12-04 15:42:18.040385104 +0000 UTC m=+1345.475054961" watchObservedRunningTime="2025-12-04 15:42:18.098445998 +0000 UTC m=+1345.533115875" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.102375 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.636568977 podStartE2EDuration="6.102364202s" podCreationTimestamp="2025-12-04 15:42:12 +0000 UTC" firstStartedPulling="2025-12-04 15:42:14.011374728 +0000 UTC m=+1341.446044585" lastFinishedPulling="2025-12-04 15:42:17.477169953 +0000 UTC m=+1344.911839810" observedRunningTime="2025-12-04 15:42:18.095373809 +0000 UTC m=+1345.530043666" watchObservedRunningTime="2025-12-04 15:42:18.102364202 +0000 UTC m=+1345.537034059" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105154 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105702 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105726 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105741 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105754 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4adc98a9-0a54-437f-a041-0a4a1f6deac9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.105768 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hlk\" (UniqueName: \"kubernetes.io/projected/4adc98a9-0a54-437f-a041-0a4a1f6deac9-kube-api-access-l6hlk\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.121356 4676 scope.go:117] "RemoveContainer" containerID="f2dac21b1543ca0262f785ff09ffe65b3f518d65e1c3b1142c67f9d294414482" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.149494 4676 scope.go:117] "RemoveContainer" containerID="2e979b0c5efea0d106eeab479a29cdf27cb8207e647fe27b6904c5ca051cd793" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.193975 4676 scope.go:117] "RemoveContainer" containerID="073b35788d7b55e6a297304e7a91f4550c1b34b3d45da7c1786c64c403eae7ef" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.237556 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.264081 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data" (OuterVolumeSpecName: "config-data") pod "4adc98a9-0a54-437f-a041-0a4a1f6deac9" (UID: "4adc98a9-0a54-437f-a041-0a4a1f6deac9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.310230 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.310445 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4adc98a9-0a54-437f-a041-0a4a1f6deac9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.382657 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.431834 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.446792 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.472156 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:18 crc kubenswrapper[4676]: E1204 15:42:18.472786 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-central-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.472805 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-central-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: E1204 15:42:18.472836 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="proxy-httpd" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.472843 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="proxy-httpd" Dec 04 15:42:18 crc kubenswrapper[4676]: E1204 15:42:18.472859 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="sg-core" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.472865 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="sg-core" Dec 04 15:42:18 crc kubenswrapper[4676]: E1204 15:42:18.472953 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-notification-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.472964 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-notification-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.473213 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-notification-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.473226 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="sg-core" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.473237 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="ceilometer-central-agent" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.473248 4676 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" containerName="proxy-httpd" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.475777 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.479946 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.480103 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.490276 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.519287 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619406 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619457 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619524 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgc7c\" (UniqueName: \"kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.619795 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.620309 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc 
kubenswrapper[4676]: I1204 15:42:18.620680 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.621171 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722650 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgc7c\" (UniqueName: \"kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722719 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722769 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722830 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722930 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722963 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.722983 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.723021 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.723502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.724286 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.728484 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.731659 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.733206 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.734222 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.741257 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.747524 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgc7c\" (UniqueName: \"kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c\") pod \"ceilometer-0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " pod="openstack/ceilometer-0" Dec 04 15:42:18 crc kubenswrapper[4676]: I1204 15:42:18.898455 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.107139 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerStarted","Data":"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac"} Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.115670 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-log" containerID="cri-o://13232c1c155f68f30e6bedc92ff8d1b6ddf0179df6e23bc23c4f0fca644e0ab3" gracePeriod=30 Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.116095 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerStarted","Data":"ee9c0b2da9c7c57f2605a18a41032b7e3e805b18ce2debda4b5a6110cc5628d0"} Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.116179 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-metadata" containerID="cri-o://ee9c0b2da9c7c57f2605a18a41032b7e3e805b18ce2debda4b5a6110cc5628d0" gracePeriod=30 Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.153406 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.993433952 podStartE2EDuration="7.153377368s" podCreationTimestamp="2025-12-04 15:42:12 +0000 UTC" firstStartedPulling="2025-12-04 15:42:14.368615987 +0000 UTC m=+1341.803285844" lastFinishedPulling="2025-12-04 15:42:17.528559403 +0000 UTC m=+1344.963229260" observedRunningTime="2025-12-04 15:42:19.140110783 +0000 UTC m=+1346.574780660" watchObservedRunningTime="2025-12-04 15:42:19.153377368 +0000 UTC m=+1346.588047235" Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.176790 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.100671891 podStartE2EDuration="7.176766076s" podCreationTimestamp="2025-12-04 15:42:12 +0000 UTC" firstStartedPulling="2025-12-04 15:42:14.423015314 +0000 UTC m=+1341.857685171" lastFinishedPulling="2025-12-04 15:42:17.499109509 +0000 UTC m=+1344.933779356" observedRunningTime="2025-12-04 15:42:19.162727129 +0000 UTC m=+1346.597396976" watchObservedRunningTime="2025-12-04 15:42:19.176766076 +0000 UTC m=+1346.611435933" Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.396691 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adc98a9-0a54-437f-a041-0a4a1f6deac9" path="/var/lib/kubelet/pods/4adc98a9-0a54-437f-a041-0a4a1f6deac9/volumes" Dec 04 15:42:19 crc kubenswrapper[4676]: I1204 15:42:19.435236 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.128778 4676 generic.go:334] "Generic (PLEG): container finished" podID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerID="ee9c0b2da9c7c57f2605a18a41032b7e3e805b18ce2debda4b5a6110cc5628d0" exitCode=0 Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.129151 4676 generic.go:334] "Generic (PLEG): container finished" podID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerID="13232c1c155f68f30e6bedc92ff8d1b6ddf0179df6e23bc23c4f0fca644e0ab3" exitCode=143 Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 
15:42:20.128845 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerDied","Data":"ee9c0b2da9c7c57f2605a18a41032b7e3e805b18ce2debda4b5a6110cc5628d0"} Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.129254 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerDied","Data":"13232c1c155f68f30e6bedc92ff8d1b6ddf0179df6e23bc23c4f0fca644e0ab3"} Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.133156 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerStarted","Data":"abbf5f27a34c6a1ef9d66701df78576b922ac79a21a77569cfb3ad1305989ef0"} Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.328761 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.375500 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs\") pod \"3398e0a0-9df1-442a-933b-cc289f5acfd4\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.375859 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqtv8\" (UniqueName: \"kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8\") pod \"3398e0a0-9df1-442a-933b-cc289f5acfd4\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.376187 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs" (OuterVolumeSpecName: "logs") pod "3398e0a0-9df1-442a-933b-cc289f5acfd4" (UID: "3398e0a0-9df1-442a-933b-cc289f5acfd4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.376327 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data\") pod \"3398e0a0-9df1-442a-933b-cc289f5acfd4\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.376510 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle\") pod \"3398e0a0-9df1-442a-933b-cc289f5acfd4\" (UID: \"3398e0a0-9df1-442a-933b-cc289f5acfd4\") " Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.377568 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3398e0a0-9df1-442a-933b-cc289f5acfd4-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.383003 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8" (OuterVolumeSpecName: "kube-api-access-hqtv8") pod "3398e0a0-9df1-442a-933b-cc289f5acfd4" (UID: "3398e0a0-9df1-442a-933b-cc289f5acfd4"). InnerVolumeSpecName "kube-api-access-hqtv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.407428 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data" (OuterVolumeSpecName: "config-data") pod "3398e0a0-9df1-442a-933b-cc289f5acfd4" (UID: "3398e0a0-9df1-442a-933b-cc289f5acfd4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.451063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3398e0a0-9df1-442a-933b-cc289f5acfd4" (UID: "3398e0a0-9df1-442a-933b-cc289f5acfd4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.479798 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.479846 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3398e0a0-9df1-442a-933b-cc289f5acfd4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:20 crc kubenswrapper[4676]: I1204 15:42:20.479862 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqtv8\" (UniqueName: \"kubernetes.io/projected/3398e0a0-9df1-442a-933b-cc289f5acfd4-kube-api-access-hqtv8\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.188096 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerStarted","Data":"1adc45a5d71cebb78915732286b10a517cb3922fcca22a36b3088503fe202639"} Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.188406 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerStarted","Data":"53c496fbeeec7bc1df17ed072378d4c91583dadc7ef052f9d71135e5f817c369"} Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.201567 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3398e0a0-9df1-442a-933b-cc289f5acfd4","Type":"ContainerDied","Data":"fffb40efd5bf9ac78787c4aac2c8eb0a7d65df0180d672c98e6e403ad2934205"} Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.201651 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.201659 4676 scope.go:117] "RemoveContainer" containerID="ee9c0b2da9c7c57f2605a18a41032b7e3e805b18ce2debda4b5a6110cc5628d0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.250098 4676 scope.go:117] "RemoveContainer" containerID="13232c1c155f68f30e6bedc92ff8d1b6ddf0179df6e23bc23c4f0fca644e0ab3" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.268970 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.283993 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.297802 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:21 crc kubenswrapper[4676]: E1204 15:42:21.298280 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-metadata" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.298299 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-metadata" Dec 04 15:42:21 crc kubenswrapper[4676]: E1204 15:42:21.298317 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-log" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.298324 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-log" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.298494 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-metadata" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.298515 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" containerName="nova-metadata-log" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.299687 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.303270 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.309100 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.309256 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.427106 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3398e0a0-9df1-442a-933b-cc289f5acfd4" path="/var/lib/kubelet/pods/3398e0a0-9df1-442a-933b-cc289f5acfd4/volumes" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.461233 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.461273 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p76kz\" (UniqueName: \"kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.461535 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.461590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.461614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564508 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564532 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564719 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564746 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p76kz\" (UniqueName: \"kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.564957 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.570280 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.570831 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.571679 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.593508 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p76kz\" (UniqueName: \"kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz\") pod \"nova-metadata-0\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " pod="openstack/nova-metadata-0" Dec 04 15:42:21 crc kubenswrapper[4676]: I1204 15:42:21.643198 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:22 crc kubenswrapper[4676]: I1204 15:42:22.223712 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:22 crc kubenswrapper[4676]: W1204 15:42:22.230308 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21c35412_68e4_4dd6_9ddf_3a72053bb40f.slice/crio-1bdd363c0aa8d9b7559821ef39e628686f00d2a0c1529a94dd177aaf8eaf37b9 WatchSource:0}: Error finding container 1bdd363c0aa8d9b7559821ef39e628686f00d2a0c1529a94dd177aaf8eaf37b9: Status 404 returned error can't find the container with id 1bdd363c0aa8d9b7559821ef39e628686f00d2a0c1529a94dd177aaf8eaf37b9 Dec 04 15:42:22 crc kubenswrapper[4676]: I1204 15:42:22.243998 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerStarted","Data":"04227894183ed8d0beb68429ee581220ad0e5bdfa734ac33c5803dc160c87425"} Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.274515 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerStarted","Data":"281eaacb3a89151d12b17dea8acebed832c5cb09033ded40057fec56ff6a05a5"} Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.274790 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.279347 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerStarted","Data":"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66"} Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.279461 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerStarted","Data":"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d"} Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.279544 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerStarted","Data":"1bdd363c0aa8d9b7559821ef39e628686f00d2a0c1529a94dd177aaf8eaf37b9"} Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.337648 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.3376026850000002 podStartE2EDuration="2.337602685s" podCreationTimestamp="2025-12-04 15:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:23.327487451 +0000 UTC m=+1350.762157328" watchObservedRunningTime="2025-12-04 15:42:23.337602685 +0000 UTC m=+1350.772272542" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.342014 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.050195892 podStartE2EDuration="5.341998622s" podCreationTimestamp="2025-12-04 15:42:18 +0000 UTC" firstStartedPulling="2025-12-04 15:42:19.451057999 +0000 UTC m=+1346.885727846" lastFinishedPulling="2025-12-04 15:42:22.742860719 +0000 UTC m=+1350.177530576" observedRunningTime="2025-12-04 15:42:23.308972594 +0000 UTC m=+1350.743642451" watchObservedRunningTime="2025-12-04 
15:42:23.341998622 +0000 UTC m=+1350.776668479" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.382115 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.423525 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.512966 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.513048 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.584465 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.639915 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.716727 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:42:23 crc kubenswrapper[4676]: I1204 15:42:23.717124 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="dnsmasq-dns" containerID="cri-o://c82492a192734375701e59a66be12946fadc6db4a6f6b952e3ed209ee42a79d2" gracePeriod=10 Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.292672 4676 generic.go:334] "Generic (PLEG): container finished" podID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerID="c82492a192734375701e59a66be12946fadc6db4a6f6b952e3ed209ee42a79d2" exitCode=0 Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.292931 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" event={"ID":"5e9cb383-58a8-45a6-86cf-85b52dd3311b","Type":"ContainerDied","Data":"c82492a192734375701e59a66be12946fadc6db4a6f6b952e3ed209ee42a79d2"} Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.293869 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" event={"ID":"5e9cb383-58a8-45a6-86cf-85b52dd3311b","Type":"ContainerDied","Data":"eb5e904a4fc3c5162eacec62c7aefa40a3dedc4ce4b29a9631080459a7f5ca35"} Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.293892 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb5e904a4fc3c5162eacec62c7aefa40a3dedc4ce4b29a9631080459a7f5ca35" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.327373 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.337175 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455006 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455119 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455149 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455217 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455234 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh4f4\" (UniqueName: \"kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.455361 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0\") pod \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\" (UID: \"5e9cb383-58a8-45a6-86cf-85b52dd3311b\") " Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.461510 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4" (OuterVolumeSpecName: "kube-api-access-rh4f4") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "kube-api-access-rh4f4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.513052 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.513414 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.561560 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh4f4\" (UniqueName: \"kubernetes.io/projected/5e9cb383-58a8-45a6-86cf-85b52dd3311b-kube-api-access-rh4f4\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.580950 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config" (OuterVolumeSpecName: "config") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.599527 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.601366 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.630578 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.634491 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e9cb383-58a8-45a6-86cf-85b52dd3311b" (UID: "5e9cb383-58a8-45a6-86cf-85b52dd3311b"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.663080 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.663135 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.663145 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.663154 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:24 crc kubenswrapper[4676]: I1204 15:42:24.663163 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9cb383-58a8-45a6-86cf-85b52dd3311b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.306141 4676 generic.go:334] "Generic (PLEG): container finished" podID="654b6ea4-eb07-4074-a7ba-d743b87f6489" containerID="ace2ef9a22a2171efb1772e651ff254eca28043d331a6a0802a7a96c7ef94df2" exitCode=0 Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.306252 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvkng" event={"ID":"654b6ea4-eb07-4074-a7ba-d743b87f6489","Type":"ContainerDied","Data":"ace2ef9a22a2171efb1772e651ff254eca28043d331a6a0802a7a96c7ef94df2"} Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.306303 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-866f9499b7-bl2lr" Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.352474 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.362360 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-866f9499b7-bl2lr"] Dec 04 15:42:25 crc kubenswrapper[4676]: I1204 15:42:25.397010 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" path="/var/lib/kubelet/pods/5e9cb383-58a8-45a6-86cf-85b52dd3311b/volumes" Dec 04 15:42:26 crc kubenswrapper[4676]: I1204 15:42:26.648012 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 15:42:26 crc kubenswrapper[4676]: I1204 15:42:26.654989 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 15:42:26 crc kubenswrapper[4676]: I1204 15:42:26.885472 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.050194 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle\") pod \"654b6ea4-eb07-4074-a7ba-d743b87f6489\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.050252 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgrzk\" (UniqueName: \"kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk\") pod \"654b6ea4-eb07-4074-a7ba-d743b87f6489\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.050393 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts\") pod \"654b6ea4-eb07-4074-a7ba-d743b87f6489\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.051104 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data\") pod \"654b6ea4-eb07-4074-a7ba-d743b87f6489\" (UID: \"654b6ea4-eb07-4074-a7ba-d743b87f6489\") " Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.064109 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts" (OuterVolumeSpecName: "scripts") pod "654b6ea4-eb07-4074-a7ba-d743b87f6489" (UID: "654b6ea4-eb07-4074-a7ba-d743b87f6489"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.064210 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk" (OuterVolumeSpecName: "kube-api-access-vgrzk") pod "654b6ea4-eb07-4074-a7ba-d743b87f6489" (UID: "654b6ea4-eb07-4074-a7ba-d743b87f6489"). InnerVolumeSpecName "kube-api-access-vgrzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.082099 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data" (OuterVolumeSpecName: "config-data") pod "654b6ea4-eb07-4074-a7ba-d743b87f6489" (UID: "654b6ea4-eb07-4074-a7ba-d743b87f6489"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.090032 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "654b6ea4-eb07-4074-a7ba-d743b87f6489" (UID: "654b6ea4-eb07-4074-a7ba-d743b87f6489"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.153078 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.153120 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgrzk\" (UniqueName: \"kubernetes.io/projected/654b6ea4-eb07-4074-a7ba-d743b87f6489-kube-api-access-vgrzk\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.153134 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.153146 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/654b6ea4-eb07-4074-a7ba-d743b87f6489-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.332938 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mvkng" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.337041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mvkng" event={"ID":"654b6ea4-eb07-4074-a7ba-d743b87f6489","Type":"ContainerDied","Data":"0254b9fac023d44ab4f7dd2b2755f5340c9b972d76fc2ca4909885bc67d3f750"} Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.337107 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0254b9fac023d44ab4f7dd2b2755f5340c9b972d76fc2ca4909885bc67d3f750" Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.470250 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.470595 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-log" containerID="cri-o://7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd" gracePeriod=30 Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.629437 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-api" containerID="cri-o://e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac" gracePeriod=30 Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.674453 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.675214 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" containerName="nova-scheduler-scheduler" containerID="cri-o://78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" gracePeriod=30 Dec 04 15:42:27 crc kubenswrapper[4676]: I1204 15:42:27.719115 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.341864 4676 generic.go:334] "Generic (PLEG): container finished" podID="282e9515-3aa8-49a9-a752-253d7cdf6b9f" 
containerID="40a9ab17647fbd3e6a32f5508d34aeafa1667d10a701981925d5b888ca267998" exitCode=0 Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.341943 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8b49f" event={"ID":"282e9515-3aa8-49a9-a752-253d7cdf6b9f","Type":"ContainerDied","Data":"40a9ab17647fbd3e6a32f5508d34aeafa1667d10a701981925d5b888ca267998"} Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.345495 4676 generic.go:334] "Generic (PLEG): container finished" podID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerID="7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd" exitCode=143 Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.345577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerDied","Data":"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd"} Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.345685 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-log" containerID="cri-o://b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" gracePeriod=30 Dec 04 15:42:28 crc kubenswrapper[4676]: I1204 15:42:28.345728 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-metadata" containerID="cri-o://aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" gracePeriod=30 Dec 04 15:42:28 crc kubenswrapper[4676]: E1204 15:42:28.385877 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 15:42:28 crc kubenswrapper[4676]: E1204 15:42:28.404245 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 15:42:28 crc kubenswrapper[4676]: E1204 15:42:28.564546 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 04 15:42:28 crc kubenswrapper[4676]: E1204 15:42:28.564757 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" containerName="nova-scheduler-scheduler" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.243360 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359563 4676 generic.go:334] "Generic (PLEG): container finished" podID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerID="aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" exitCode=0 Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359606 4676 generic.go:334] "Generic (PLEG): container finished" podID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerID="b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" exitCode=143 Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359667 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359753 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerDied","Data":"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66"} Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359821 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerDied","Data":"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d"} Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359838 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"21c35412-68e4-4dd6-9ddf-3a72053bb40f","Type":"ContainerDied","Data":"1bdd363c0aa8d9b7559821ef39e628686f00d2a0c1529a94dd177aaf8eaf37b9"} Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.359954 4676 scope.go:117] "RemoveContainer" containerID="aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.381111 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs\") pod \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.381174 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs\") pod \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.381227 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p76kz\" (UniqueName: \"kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz\") pod \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.381250 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data\") pod \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\" (UID: \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.381364 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle\") pod \"21c35412-68e4-4dd6-9ddf-3a72053bb40f\" (UID: 
\"21c35412-68e4-4dd6-9ddf-3a72053bb40f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.383252 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs" (OuterVolumeSpecName: "logs") pod "21c35412-68e4-4dd6-9ddf-3a72053bb40f" (UID: "21c35412-68e4-4dd6-9ddf-3a72053bb40f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.388134 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz" (OuterVolumeSpecName: "kube-api-access-p76kz") pod "21c35412-68e4-4dd6-9ddf-3a72053bb40f" (UID: "21c35412-68e4-4dd6-9ddf-3a72053bb40f"). InnerVolumeSpecName "kube-api-access-p76kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.431285 4676 scope.go:117] "RemoveContainer" containerID="b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.437489 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data" (OuterVolumeSpecName: "config-data") pod "21c35412-68e4-4dd6-9ddf-3a72053bb40f" (UID: "21c35412-68e4-4dd6-9ddf-3a72053bb40f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.453387 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c35412-68e4-4dd6-9ddf-3a72053bb40f" (UID: "21c35412-68e4-4dd6-9ddf-3a72053bb40f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.483932 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.483971 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21c35412-68e4-4dd6-9ddf-3a72053bb40f-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.483984 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p76kz\" (UniqueName: \"kubernetes.io/projected/21c35412-68e4-4dd6-9ddf-3a72053bb40f-kube-api-access-p76kz\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.483994 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.492197 4676 scope.go:117] "RemoveContainer" containerID="aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.492976 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66\": container with ID starting with aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66 not found: ID does not exist" containerID="aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.493053 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66"} err="failed to get container status \"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66\": rpc error: code = NotFound desc = could not find container \"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66\": container with ID starting with aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66 not found: ID does not exist" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.493085 4676 scope.go:117] "RemoveContainer" containerID="b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.500156 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "21c35412-68e4-4dd6-9ddf-3a72053bb40f" (UID: "21c35412-68e4-4dd6-9ddf-3a72053bb40f"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.500175 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d\": container with ID starting with b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d not found: ID does not exist" containerID="b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.500251 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d"} err="failed to get container status \"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d\": rpc error: code = NotFound desc = could not find container \"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d\": container with ID starting with b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d not found: ID does not exist" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.500286 4676 scope.go:117] "RemoveContainer" containerID="aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.501509 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66"} err="failed to get container status \"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66\": rpc error: code = NotFound desc = could not find container \"aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66\": container with ID starting with aec314922407d4921be0288096ae8e27ba57a9cbfa4b8c5faa60ee7a6918ef66 not found: ID does not exist" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.501549 4676 scope.go:117] "RemoveContainer" containerID="b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.501857 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d"} err="failed to get container status \"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d\": rpc error: code = NotFound desc = could not find container \"b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d\": container with ID starting with b11e04c2d99612aa8921cd064e7f0ce913ea60f33c921c03546dd5eba660187d not found: ID does not exist" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.588793 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/21c35412-68e4-4dd6-9ddf-3a72053bb40f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.703554 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.717053 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.745468 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.746175 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="init" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746208 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="init" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.746251 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-metadata" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746263 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-metadata" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.746301 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654b6ea4-eb07-4074-a7ba-d743b87f6489" containerName="nova-manage" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746309 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="654b6ea4-eb07-4074-a7ba-d743b87f6489" containerName="nova-manage" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.746342 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="dnsmasq-dns" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746351 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="dnsmasq-dns" Dec 04 15:42:29 crc kubenswrapper[4676]: E1204 15:42:29.746366 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-log" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746374 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-log" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746834 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cb383-58a8-45a6-86cf-85b52dd3311b" containerName="dnsmasq-dns" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746875 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="654b6ea4-eb07-4074-a7ba-d743b87f6489" containerName="nova-manage" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746953 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-metadata" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.746996 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" containerName="nova-metadata-log" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.749301 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.753391 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.753607 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.756608 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.804112 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904196 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data\") pod \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904366 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle\") pod \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904407 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg85x\" (UniqueName: \"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x\") pod \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904471 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts\") pod \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\" (UID: \"282e9515-3aa8-49a9-a752-253d7cdf6b9f\") " Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904789 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.904861 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.905226 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.905302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.905362 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6rsc\" (UniqueName: \"kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.908397 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x" (OuterVolumeSpecName: "kube-api-access-mg85x") pod "282e9515-3aa8-49a9-a752-253d7cdf6b9f" (UID: "282e9515-3aa8-49a9-a752-253d7cdf6b9f"). InnerVolumeSpecName "kube-api-access-mg85x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.917441 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts" (OuterVolumeSpecName: "scripts") pod "282e9515-3aa8-49a9-a752-253d7cdf6b9f" (UID: "282e9515-3aa8-49a9-a752-253d7cdf6b9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.931023 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data" (OuterVolumeSpecName: "config-data") pod "282e9515-3aa8-49a9-a752-253d7cdf6b9f" (UID: "282e9515-3aa8-49a9-a752-253d7cdf6b9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:29 crc kubenswrapper[4676]: I1204 15:42:29.936058 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "282e9515-3aa8-49a9-a752-253d7cdf6b9f" (UID: "282e9515-3aa8-49a9-a752-253d7cdf6b9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.007472 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.007726 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.007839 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6rsc\" (UniqueName: \"kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.008100 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.008589 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.008593 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.008875 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.008988 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg85x\" (UniqueName: \"kubernetes.io/projected/282e9515-3aa8-49a9-a752-253d7cdf6b9f-kube-api-access-mg85x\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.009078 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.009217 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/282e9515-3aa8-49a9-a752-253d7cdf6b9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.012348 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.012520 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.012615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.026191 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6rsc\" (UniqueName: \"kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc\") pod \"nova-metadata-0\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") " pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.118787 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.380465 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-8b49f" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.380647 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-8b49f" event={"ID":"282e9515-3aa8-49a9-a752-253d7cdf6b9f","Type":"ContainerDied","Data":"55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd"} Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.382591 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55ba86c092bdbfaddb268ae6541460acd777fbf137f718553ef1e481298d15bd" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.463355 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 15:42:30 crc kubenswrapper[4676]: E1204 15:42:30.463773 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="282e9515-3aa8-49a9-a752-253d7cdf6b9f" containerName="nova-cell1-conductor-db-sync" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.463789 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="282e9515-3aa8-49a9-a752-253d7cdf6b9f" containerName="nova-cell1-conductor-db-sync" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.464029 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="282e9515-3aa8-49a9-a752-253d7cdf6b9f" containerName="nova-cell1-conductor-db-sync" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.464828 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.472034 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.475409 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.692634 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.692724 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxg78\" (UniqueName: \"kubernetes.io/projected/f4c8dccd-8cfb-4b04-a035-e7af36e48038-kube-api-access-xxg78\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.692758 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.758872 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.794422 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.794494 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxg78\" (UniqueName: \"kubernetes.io/projected/f4c8dccd-8cfb-4b04-a035-e7af36e48038-kube-api-access-xxg78\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.794544 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.800783 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.800950 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c8dccd-8cfb-4b04-a035-e7af36e48038-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:30 crc kubenswrapper[4676]: I1204 15:42:30.814965 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxg78\" (UniqueName: \"kubernetes.io/projected/f4c8dccd-8cfb-4b04-a035-e7af36e48038-kube-api-access-xxg78\") pod \"nova-cell1-conductor-0\" (UID: \"f4c8dccd-8cfb-4b04-a035-e7af36e48038\") " pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.060591 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.082895 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.201263 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs\") pod \"8468d903-d218-42b8-9621-6ec64ee2a7f9\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.201330 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data\") pod \"8468d903-d218-42b8-9621-6ec64ee2a7f9\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.201365 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hjq7\" (UniqueName: \"kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7\") pod \"8468d903-d218-42b8-9621-6ec64ee2a7f9\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.201587 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle\") pod \"8468d903-d218-42b8-9621-6ec64ee2a7f9\" (UID: \"8468d903-d218-42b8-9621-6ec64ee2a7f9\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.202042 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs" (OuterVolumeSpecName: "logs") pod "8468d903-d218-42b8-9621-6ec64ee2a7f9" (UID: "8468d903-d218-42b8-9621-6ec64ee2a7f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.205849 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8468d903-d218-42b8-9621-6ec64ee2a7f9-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.211007 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7" (OuterVolumeSpecName: "kube-api-access-5hjq7") pod "8468d903-d218-42b8-9621-6ec64ee2a7f9" (UID: "8468d903-d218-42b8-9621-6ec64ee2a7f9"). InnerVolumeSpecName "kube-api-access-5hjq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.240254 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data" (OuterVolumeSpecName: "config-data") pod "8468d903-d218-42b8-9621-6ec64ee2a7f9" (UID: "8468d903-d218-42b8-9621-6ec64ee2a7f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.242178 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8468d903-d218-42b8-9621-6ec64ee2a7f9" (UID: "8468d903-d218-42b8-9621-6ec64ee2a7f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.273781 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.310095 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data\") pod \"09e1cddd-f35d-4e93-9331-429675aa4275\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.310257 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle\") pod \"09e1cddd-f35d-4e93-9331-429675aa4275\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.310416 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vft4l\" (UniqueName: \"kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l\") pod \"09e1cddd-f35d-4e93-9331-429675aa4275\" (UID: \"09e1cddd-f35d-4e93-9331-429675aa4275\") " Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.310970 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.310990 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8468d903-d218-42b8-9621-6ec64ee2a7f9-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.311002 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hjq7\" (UniqueName: \"kubernetes.io/projected/8468d903-d218-42b8-9621-6ec64ee2a7f9-kube-api-access-5hjq7\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.315665 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l" (OuterVolumeSpecName: "kube-api-access-vft4l") pod "09e1cddd-f35d-4e93-9331-429675aa4275" (UID: "09e1cddd-f35d-4e93-9331-429675aa4275"). InnerVolumeSpecName "kube-api-access-vft4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.355349 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data" (OuterVolumeSpecName: "config-data") pod "09e1cddd-f35d-4e93-9331-429675aa4275" (UID: "09e1cddd-f35d-4e93-9331-429675aa4275"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.392257 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09e1cddd-f35d-4e93-9331-429675aa4275" (UID: "09e1cddd-f35d-4e93-9331-429675aa4275"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.402306 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c35412-68e4-4dd6-9ddf-3a72053bb40f" path="/var/lib/kubelet/pods/21c35412-68e4-4dd6-9ddf-3a72053bb40f/volumes" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.412292 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vft4l\" (UniqueName: \"kubernetes.io/projected/09e1cddd-f35d-4e93-9331-429675aa4275-kube-api-access-vft4l\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.412320 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.412330 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09e1cddd-f35d-4e93-9331-429675aa4275-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.419456 4676 generic.go:334] "Generic (PLEG): container finished" podID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerID="e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac" exitCode=0 Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.419517 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.419531 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerDied","Data":"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.419637 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8468d903-d218-42b8-9621-6ec64ee2a7f9","Type":"ContainerDied","Data":"535946f01a55da8dfff70b594522dd891e2b4fb8896e61ef610ca637ccf7b12c"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.419659 4676 scope.go:117] "RemoveContainer" containerID="e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.424620 4676 generic.go:334] "Generic (PLEG): container finished" podID="09e1cddd-f35d-4e93-9331-429675aa4275" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" exitCode=0 Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.424713 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09e1cddd-f35d-4e93-9331-429675aa4275","Type":"ContainerDied","Data":"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.424765 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09e1cddd-f35d-4e93-9331-429675aa4275","Type":"ContainerDied","Data":"ce7f29dc159161bc9f4e8ebb83da8c730c9d8288e5f2a02f2bca2ad7f095ec53"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.424849 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.438166 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerStarted","Data":"22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.438362 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerStarted","Data":"cd19c79b7fea5c3e8ad01f10f0adfd134cc1d2555699c3dbb9b736b5be86cdcd"} Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.454182 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.457544 4676 scope.go:117] "RemoveContainer" containerID="7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.463021 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.472507 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.473370 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" containerName="nova-scheduler-scheduler" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.473787 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" containerName="nova-scheduler-scheduler" Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.473822 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-api" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.473828 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-api" Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.473841 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-log" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.473847 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-log" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.474114 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" containerName="nova-scheduler-scheduler" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.474139 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-log" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.474149 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" containerName="nova-api-api" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.474726 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4747091709999998 podStartE2EDuration="2.474709171s" podCreationTimestamp="2025-12-04 15:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:31.466616606 +0000 UTC 
m=+1358.901286473" watchObservedRunningTime="2025-12-04 15:42:31.474709171 +0000 UTC m=+1358.909379028" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.475300 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.478255 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.495364 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.508709 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.514362 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.514590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzwd\" (UniqueName: \"kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.514627 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.514707 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.517979 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.524079 4676 scope.go:117] "RemoveContainer" containerID="e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac" Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.525522 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac\": container with ID starting with e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac not found: ID does not exist" containerID="e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.525557 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac"} err="failed to get container status \"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac\": rpc error: code = NotFound desc = could not find container \"e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac\": container with ID starting with 
e4c0b85f915c948e2a41ccc604d84505b4875f352dd3697c0aa33ac152dbe4ac not found: ID does not exist" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.525583 4676 scope.go:117] "RemoveContainer" containerID="7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd" Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.525896 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd\": container with ID starting with 7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd not found: ID does not exist" containerID="7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.525933 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd"} err="failed to get container status \"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd\": rpc error: code = NotFound desc = could not find container \"7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd\": container with ID starting with 7aa42a6ef2edd57455ea6b92a3e8d77edd91417374ab74359f57f21169811abd not found: ID does not exist" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.525947 4676 scope.go:117] "RemoveContainer" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.535435 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.543329 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.547479 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.551660 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.576176 4676 scope.go:117] "RemoveContainer" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" Dec 04 15:42:31 crc kubenswrapper[4676]: E1204 15:42:31.576825 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a\": container with ID starting with 78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a not found: ID does not exist" containerID="78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.576873 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a"} err="failed to get container status \"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a\": rpc error: code = NotFound desc = could not find container \"78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a\": container with ID starting with 78cc54a18e707be02696916cf9bd9adbc7ab011a33b0e24961435121a5147a9a not found: ID does not exist" Dec 04 15:42:31 crc kubenswrapper[4676]: W1204 15:42:31.600000 4676 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4c8dccd_8cfb_4b04_a035_e7af36e48038.slice/crio-b3105c08956d68b046afa2bc05bc4392bc7fb2d97ef353ba5fb24e70eb098b1f WatchSource:0}: Error finding container b3105c08956d68b046afa2bc05bc4392bc7fb2d97ef353ba5fb24e70eb098b1f: Status 404 returned error can't find the container with id b3105c08956d68b046afa2bc05bc4392bc7fb2d97ef353ba5fb24e70eb098b1f Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.600726 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616466 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzwd\" (UniqueName: \"kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616507 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616551 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzgxs\" (UniqueName: \"kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616693 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616761 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616802 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.616846 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.618154 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.621513 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.630325 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.636132 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzwd\" (UniqueName: \"kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd\") pod \"nova-api-0\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.718477 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzgxs\" (UniqueName: \"kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.718598 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.718673 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.722641 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.726088 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.733673 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzgxs\" (UniqueName: \"kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs\") pod \"nova-scheduler-0\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " pod="openstack/nova-scheduler-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.805250 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:31 crc kubenswrapper[4676]: I1204 15:42:31.863636 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.405123 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.421871 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.458547 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerStarted","Data":"ec30475a9a69e1763b10181527f9f74232e50079a86b1bb6a9f9a882548167ab"} Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.460679 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerStarted","Data":"400669dabade0967304638c6be5726f3572bc8c3982e03a7f19540c45ef1a782"} Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.462826 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f4c8dccd-8cfb-4b04-a035-e7af36e48038","Type":"ContainerStarted","Data":"27da5f57952e48fd6a7c5117c075b099a2034fcdbea66b4c1b7f96da45ca827c"} Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.462883 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f4c8dccd-8cfb-4b04-a035-e7af36e48038","Type":"ContainerStarted","Data":"b3105c08956d68b046afa2bc05bc4392bc7fb2d97ef353ba5fb24e70eb098b1f"} Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.466417 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33792424-6952-4280-9589-83aeb894841e","Type":"ContainerStarted","Data":"b762eee2583702fba0dbda79e0422398c49206fa28c1487e068bfbd27496cc94"} Dec 04 15:42:32 crc kubenswrapper[4676]: I1204 15:42:32.491108 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.491088962 podStartE2EDuration="2.491088962s" podCreationTimestamp="2025-12-04 15:42:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:32.489187747 +0000 UTC m=+1359.923857624" watchObservedRunningTime="2025-12-04 15:42:32.491088962 +0000 UTC m=+1359.925758839" Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.408675 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e1cddd-f35d-4e93-9331-429675aa4275" path="/var/lib/kubelet/pods/09e1cddd-f35d-4e93-9331-429675aa4275/volumes" Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.409530 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8468d903-d218-42b8-9621-6ec64ee2a7f9" path="/var/lib/kubelet/pods/8468d903-d218-42b8-9621-6ec64ee2a7f9/volumes" Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.479547 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33792424-6952-4280-9589-83aeb894841e","Type":"ContainerStarted","Data":"02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8"} Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.483856 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerStarted","Data":"6af2c1e8a13076783ab19cb9ffd786ada0bbdf9e51fe5eeacc07140d3cf37fb6"} Dec 04 
15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.483894 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.483920 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerStarted","Data":"36e3a5392018915650b55e5e99eeaa5ae92048b527cfd85a76e14cb024d56772"} Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.497821 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.497801023 podStartE2EDuration="2.497801023s" podCreationTimestamp="2025-12-04 15:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:33.493675324 +0000 UTC m=+1360.928345191" watchObservedRunningTime="2025-12-04 15:42:33.497801023 +0000 UTC m=+1360.932470880" Dec 04 15:42:33 crc kubenswrapper[4676]: I1204 15:42:33.525102 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.525078474 podStartE2EDuration="2.525078474s" podCreationTimestamp="2025-12-04 15:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:33.516379652 +0000 UTC m=+1360.951049509" watchObservedRunningTime="2025-12-04 15:42:33.525078474 +0000 UTC m=+1360.959748331" Dec 04 15:42:35 crc kubenswrapper[4676]: I1204 15:42:35.119728 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 15:42:35 crc kubenswrapper[4676]: I1204 15:42:35.120066 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 04 15:42:36 crc kubenswrapper[4676]: I1204 15:42:36.142662 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 04 15:42:36 crc kubenswrapper[4676]: I1204 15:42:36.864362 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 04 15:42:40 crc kubenswrapper[4676]: I1204 15:42:40.119655 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 15:42:40 crc kubenswrapper[4676]: I1204 15:42:40.120267 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 15:42:41.127202 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 15:42:41.134258 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 15:42:41.805890 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 
15:42:41.806039 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 15:42:41.864863 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 04 15:42:41 crc kubenswrapper[4676]: I1204 15:42:41.913809 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 04 15:42:42 crc kubenswrapper[4676]: I1204 15:42:42.642307 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 04 15:42:42 crc kubenswrapper[4676]: I1204 15:42:42.887142 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:42 crc kubenswrapper[4676]: I1204 15:42:42.887142 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.214:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.582591 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.690755 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" containerID="9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3" exitCode=137 Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.690851 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.690860 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc5ec209-3e74-4d87-ba5f-d84052dd2c32","Type":"ContainerDied","Data":"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3"} Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.691271 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fc5ec209-3e74-4d87-ba5f-d84052dd2c32","Type":"ContainerDied","Data":"3749cf14963d6dad41d9c9242141823e535bbb9c28cbc4fc14b41179c883d22a"} Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.691295 4676 scope.go:117] "RemoveContainer" containerID="9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.732406 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data\") pod \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.732526 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle\") pod \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.732586 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4nhv\" (UniqueName: \"kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv\") pod \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\" (UID: \"fc5ec209-3e74-4d87-ba5f-d84052dd2c32\") " Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.740565 4676 scope.go:117] "RemoveContainer" containerID="9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3" Dec 04 15:42:48 crc kubenswrapper[4676]: E1204 15:42:48.741424 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3\": container with ID starting with 9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3 not found: ID does not exist" containerID="9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.741563 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3"} err="failed to get container status \"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3\": rpc error: code = NotFound desc = could not find container \"9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3\": container with ID starting with 9f34d820778b5c88fdad1d2feedb632bf60ec7ba5105dc145949ba851cae9ee3 not found: ID does not exist" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.741798 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv" (OuterVolumeSpecName: "kube-api-access-w4nhv") pod "fc5ec209-3e74-4d87-ba5f-d84052dd2c32" (UID: "fc5ec209-3e74-4d87-ba5f-d84052dd2c32"). 
InnerVolumeSpecName "kube-api-access-w4nhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.760223 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data" (OuterVolumeSpecName: "config-data") pod "fc5ec209-3e74-4d87-ba5f-d84052dd2c32" (UID: "fc5ec209-3e74-4d87-ba5f-d84052dd2c32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.762061 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5ec209-3e74-4d87-ba5f-d84052dd2c32" (UID: "fc5ec209-3e74-4d87-ba5f-d84052dd2c32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.835760 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.835814 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4nhv\" (UniqueName: \"kubernetes.io/projected/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-kube-api-access-w4nhv\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.835832 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc5ec209-3e74-4d87-ba5f-d84052dd2c32-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:48 crc kubenswrapper[4676]: I1204 15:42:48.907950 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.042247 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.052983 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.085989 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:49 crc kubenswrapper[4676]: E1204 15:42:49.086554 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.086577 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.086798 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" containerName="nova-cell1-novncproxy-novncproxy" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.087606 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.093740 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.093952 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.094154 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.108624 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.143879 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.144122 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4jlx\" (UniqueName: \"kubernetes.io/projected/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-kube-api-access-r4jlx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.144170 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.144223 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.144377 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.246402 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jlx\" (UniqueName: \"kubernetes.io/projected/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-kube-api-access-r4jlx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.246716 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.246788 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.246884 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.246959 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.251053 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.251324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.251542 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.261314 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.267411 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jlx\" (UniqueName: \"kubernetes.io/projected/0081333b-fdf2-4cc5-924c-3d1ad7dc0419-kube-api-access-r4jlx\") pod \"nova-cell1-novncproxy-0\" (UID: \"0081333b-fdf2-4cc5-924c-3d1ad7dc0419\") " pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.396523 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5ec209-3e74-4d87-ba5f-d84052dd2c32" path="/var/lib/kubelet/pods/fc5ec209-3e74-4d87-ba5f-d84052dd2c32/volumes" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.433058 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:49 crc kubenswrapper[4676]: I1204 15:42:49.919233 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 04 15:42:49 crc kubenswrapper[4676]: W1204 15:42:49.919951 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0081333b_fdf2_4cc5_924c_3d1ad7dc0419.slice/crio-59ba45b90462db762aed6881a4921aa6652f792e454a1e81e369632548e766b1 WatchSource:0}: Error finding container 59ba45b90462db762aed6881a4921aa6652f792e454a1e81e369632548e766b1: Status 404 returned error can't find the container with id 59ba45b90462db762aed6881a4921aa6652f792e454a1e81e369632548e766b1 Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.126423 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.135967 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.149246 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.724352 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0081333b-fdf2-4cc5-924c-3d1ad7dc0419","Type":"ContainerStarted","Data":"c2bc17a17c7fcfe90a2da968858d3a7e9fb5c5efba693ad2bd38d9d9bbeb737f"} Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.724410 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0081333b-fdf2-4cc5-924c-3d1ad7dc0419","Type":"ContainerStarted","Data":"59ba45b90462db762aed6881a4921aa6652f792e454a1e81e369632548e766b1"} Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.736035 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 15:42:50 crc kubenswrapper[4676]: I1204 15:42:50.741055 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.741040773 podStartE2EDuration="1.741040773s" podCreationTimestamp="2025-12-04 15:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:50.739860929 +0000 UTC m=+1378.174530796" watchObservedRunningTime="2025-12-04 15:42:50.741040773 +0000 UTC m=+1378.175710630" Dec 04 15:42:51 crc kubenswrapper[4676]: I1204 15:42:51.815302 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 15:42:51 crc kubenswrapper[4676]: I1204 15:42:51.815954 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 15:42:51 crc kubenswrapper[4676]: I1204 15:42:51.821395 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 15:42:51 crc kubenswrapper[4676]: I1204 15:42:51.822447 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 15:42:52 crc kubenswrapper[4676]: I1204 15:42:52.742722 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 15:42:52 crc kubenswrapper[4676]: I1204 15:42:52.750299 4676 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 15:42:52 crc kubenswrapper[4676]: I1204 15:42:52.958205 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:42:52 crc kubenswrapper[4676]: I1204 15:42:52.960827 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:52 crc kubenswrapper[4676]: I1204 15:42:52.981315 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088396 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088512 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088580 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088621 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088714 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.088745 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4ww\" (UniqueName: \"kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.190839 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.190895 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4ww\" 
(UniqueName: \"kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.190978 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.191050 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.191115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.191165 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.192212 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.192293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.192298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.193131 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.193180 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.215725 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4ww\" (UniqueName: \"kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww\") pod \"dnsmasq-dns-5d658544b9-r5sxw\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.293645 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:53 crc kubenswrapper[4676]: I1204 15:42:53.793116 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:42:54 crc kubenswrapper[4676]: I1204 15:42:54.434035 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:54 crc kubenswrapper[4676]: I1204 15:42:54.821705 4676 generic.go:334] "Generic (PLEG): container finished" podID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerID="8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90" exitCode=0 Dec 04 15:42:54 crc kubenswrapper[4676]: I1204 15:42:54.821757 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" event={"ID":"5e9e8792-ee83-463a-be59-f11e4eaa78e0","Type":"ContainerDied","Data":"8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90"} Dec 04 15:42:54 crc kubenswrapper[4676]: I1204 15:42:54.822070 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" event={"ID":"5e9e8792-ee83-463a-be59-f11e4eaa78e0","Type":"ContainerStarted","Data":"ca924f3bc8887fad574489ee51d48de4f579b2a9835a2efb4719ccbf87ad193d"} Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.833995 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" event={"ID":"5e9e8792-ee83-463a-be59-f11e4eaa78e0","Type":"ContainerStarted","Data":"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695"} Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.834290 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.857170 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" podStartSLOduration=3.8571455009999998 podStartE2EDuration="3.857145501s" podCreationTimestamp="2025-12-04 15:42:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:42:55.849697815 +0000 UTC m=+1383.284367682" watchObservedRunningTime="2025-12-04 15:42:55.857145501 +0000 UTC m=+1383.291815358" Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.907320 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.907584 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-log" containerID="cri-o://36e3a5392018915650b55e5e99eeaa5ae92048b527cfd85a76e14cb024d56772" gracePeriod=30 Dec 04 15:42:55 crc kubenswrapper[4676]: I1204 15:42:55.907630 4676 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-api" containerID="cri-o://6af2c1e8a13076783ab19cb9ffd786ada0bbdf9e51fe5eeacc07140d3cf37fb6" gracePeriod=30 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.180950 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.181635 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="proxy-httpd" containerID="cri-o://281eaacb3a89151d12b17dea8acebed832c5cb09033ded40057fec56ff6a05a5" gracePeriod=30 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.181701 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-notification-agent" containerID="cri-o://1adc45a5d71cebb78915732286b10a517cb3922fcca22a36b3088503fe202639" gracePeriod=30 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.181635 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="sg-core" containerID="cri-o://04227894183ed8d0beb68429ee581220ad0e5bdfa734ac33c5803dc160c87425" gracePeriod=30 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.181565 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-central-agent" containerID="cri-o://53c496fbeeec7bc1df17ed072378d4c91583dadc7ef052f9d71135e5f817c369" gracePeriod=30 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.846145 4676 generic.go:334] "Generic (PLEG): container finished" podID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerID="36e3a5392018915650b55e5e99eeaa5ae92048b527cfd85a76e14cb024d56772" exitCode=143 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.846185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerDied","Data":"36e3a5392018915650b55e5e99eeaa5ae92048b527cfd85a76e14cb024d56772"} Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849679 4676 generic.go:334] "Generic (PLEG): container finished" podID="6387383e-d39c-4e26-b204-1fedb37707b0" containerID="281eaacb3a89151d12b17dea8acebed832c5cb09033ded40057fec56ff6a05a5" exitCode=0 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849701 4676 generic.go:334] "Generic (PLEG): container finished" podID="6387383e-d39c-4e26-b204-1fedb37707b0" containerID="04227894183ed8d0beb68429ee581220ad0e5bdfa734ac33c5803dc160c87425" exitCode=2 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849710 4676 generic.go:334] "Generic (PLEG): container finished" podID="6387383e-d39c-4e26-b204-1fedb37707b0" containerID="53c496fbeeec7bc1df17ed072378d4c91583dadc7ef052f9d71135e5f817c369" exitCode=0 Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerDied","Data":"281eaacb3a89151d12b17dea8acebed832c5cb09033ded40057fec56ff6a05a5"} Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849776 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerDied","Data":"04227894183ed8d0beb68429ee581220ad0e5bdfa734ac33c5803dc160c87425"} Dec 04 15:42:56 crc kubenswrapper[4676]: I1204 15:42:56.849793 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerDied","Data":"53c496fbeeec7bc1df17ed072378d4c91583dadc7ef052f9d71135e5f817c369"} Dec 04 15:42:57 crc kubenswrapper[4676]: I1204 15:42:57.892800 4676 generic.go:334] "Generic (PLEG): container finished" podID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerID="6af2c1e8a13076783ab19cb9ffd786ada0bbdf9e51fe5eeacc07140d3cf37fb6" exitCode=0 Dec 04 15:42:57 crc kubenswrapper[4676]: I1204 15:42:57.892836 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerDied","Data":"6af2c1e8a13076783ab19cb9ffd786ada0bbdf9e51fe5eeacc07140d3cf37fb6"} Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.272488 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.393311 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs\") pod \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.394052 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle\") pod \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.394165 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkzwd\" (UniqueName: \"kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd\") pod \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.394219 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data\") pod \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\" (UID: \"4f4617a5-dca9-4ae0-976a-989d8c8d047d\") " Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.394497 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs" (OuterVolumeSpecName: "logs") pod "4f4617a5-dca9-4ae0-976a-989d8c8d047d" (UID: "4f4617a5-dca9-4ae0-976a-989d8c8d047d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.395138 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f4617a5-dca9-4ae0-976a-989d8c8d047d-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.404147 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd" (OuterVolumeSpecName: "kube-api-access-vkzwd") pod "4f4617a5-dca9-4ae0-976a-989d8c8d047d" (UID: "4f4617a5-dca9-4ae0-976a-989d8c8d047d"). InnerVolumeSpecName "kube-api-access-vkzwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.434156 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data" (OuterVolumeSpecName: "config-data") pod "4f4617a5-dca9-4ae0-976a-989d8c8d047d" (UID: "4f4617a5-dca9-4ae0-976a-989d8c8d047d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.472638 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f4617a5-dca9-4ae0-976a-989d8c8d047d" (UID: "4f4617a5-dca9-4ae0-976a-989d8c8d047d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.498725 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.498761 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkzwd\" (UniqueName: \"kubernetes.io/projected/4f4617a5-dca9-4ae0-976a-989d8c8d047d-kube-api-access-vkzwd\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.498774 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f4617a5-dca9-4ae0-976a-989d8c8d047d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.906330 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f4617a5-dca9-4ae0-976a-989d8c8d047d","Type":"ContainerDied","Data":"400669dabade0967304638c6be5726f3572bc8c3982e03a7f19540c45ef1a782"} Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.906408 4676 scope.go:117] "RemoveContainer" containerID="6af2c1e8a13076783ab19cb9ffd786ada0bbdf9e51fe5eeacc07140d3cf37fb6" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.907022 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.932828 4676 scope.go:117] "RemoveContainer" containerID="36e3a5392018915650b55e5e99eeaa5ae92048b527cfd85a76e14cb024d56772" Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.954088 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:58 crc kubenswrapper[4676]: I1204 15:42:58.984935 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.001987 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:59 crc kubenswrapper[4676]: E1204 15:42:59.002752 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-log" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.002773 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-log" Dec 04 15:42:59 crc kubenswrapper[4676]: E1204 15:42:59.002812 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-api" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.002820 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-api" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.003062 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-log" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.003097 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" containerName="nova-api-api" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.004491 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.008281 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.008484 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.008597 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.010567 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.118369 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vht4k\" (UniqueName: \"kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.118432 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.118682 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.118751 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.119052 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.119195 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220480 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220534 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data\") pod \"nova-api-0\" (UID: 
\"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220623 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220674 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220705 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vht4k\" (UniqueName: \"kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.220720 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.221602 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.224687 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.225232 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.226066 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.227261 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.246130 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vht4k\" (UniqueName: \"kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k\") pod \"nova-api-0\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " pod="openstack/nova-api-0" 
Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.322236 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.413617 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4617a5-dca9-4ae0-976a-989d8c8d047d" path="/var/lib/kubelet/pods/4f4617a5-dca9-4ae0-976a-989d8c8d047d/volumes" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.434318 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.546338 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.934222 4676 generic.go:334] "Generic (PLEG): container finished" podID="6387383e-d39c-4e26-b204-1fedb37707b0" containerID="1adc45a5d71cebb78915732286b10a517cb3922fcca22a36b3088503fe202639" exitCode=0 Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.934312 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerDied","Data":"1adc45a5d71cebb78915732286b10a517cb3922fcca22a36b3088503fe202639"} Dec 04 15:42:59 crc kubenswrapper[4676]: I1204 15:42:59.962562 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.197086 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.213795 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-wfvln"] Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.215218 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.220301 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.220586 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.255792 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.268671 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wfvln"] Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.362600 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.362984 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363075 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363134 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363217 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363245 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363252 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363277 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgc7c\" (UniqueName: \"kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363335 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd\") pod \"6387383e-d39c-4e26-b204-1fedb37707b0\" (UID: \"6387383e-d39c-4e26-b204-1fedb37707b0\") " Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363561 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2qm\" (UniqueName: \"kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363616 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363655 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.363748 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.364453 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.365596 4676 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.365628 4676 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6387383e-d39c-4e26-b204-1fedb37707b0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.373041 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts" (OuterVolumeSpecName: "scripts") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.377191 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c" (OuterVolumeSpecName: "kube-api-access-xgc7c") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "kube-api-access-xgc7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.449010 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467095 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467226 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv2qm\" (UniqueName: \"kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467263 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467297 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467364 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgc7c\" (UniqueName: \"kubernetes.io/projected/6387383e-d39c-4e26-b204-1fedb37707b0-kube-api-access-xgc7c\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467378 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-scripts\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.467389 4676 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.471423 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.472009 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.474830 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 
15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.490159 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.491243 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.491946 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv2qm\" (UniqueName: \"kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm\") pod \"nova-cell1-cell-mapping-wfvln\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") " pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.541569 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data" (OuterVolumeSpecName: "config-data") pod "6387383e-d39c-4e26-b204-1fedb37707b0" (UID: "6387383e-d39c-4e26-b204-1fedb37707b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.626097 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.626133 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.626145 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6387383e-d39c-4e26-b204-1fedb37707b0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.632132 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wfvln" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.958394 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6387383e-d39c-4e26-b204-1fedb37707b0","Type":"ContainerDied","Data":"abbf5f27a34c6a1ef9d66701df78576b922ac79a21a77569cfb3ad1305989ef0"} Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.958699 4676 scope.go:117] "RemoveContainer" containerID="281eaacb3a89151d12b17dea8acebed832c5cb09033ded40057fec56ff6a05a5" Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.958810 4676 util.go:48] "No ready sandbox for pod can be found. 
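[Annotation, not part of the captured journal: the UnmountVolume/MountVolume pairs above are the kubelet volume reconciler retiring the old ceilometer-0 pod's volumes while attaching nova-cell1-cell-mapping-wfvln's secret, projected token, and emptyDir volumes. A minimal client-go sketch that lists the same volume types for a pod, assuming a reachable cluster, a kubeconfig at the default path, and that the (short-lived) job pod still exists; substitute any live pod.]

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the default kubeconfig (~/.kube/config) and build a clientset.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Pod name/namespace taken from the journal entries above.
	pod, err := client.CoreV1().Pods("openstack").Get(context.TODO(), "nova-cell1-cell-mapping-wfvln", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Print each declared volume and its plugin type, mirroring the
	// kinds named in the reconciler log lines (projected/secret/empty-dir).
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.Projected != nil:
			fmt.Printf("%s\tprojected (%d sources)\n", v.Name, len(v.Projected.Sources))
		case v.Secret != nil:
			fmt.Printf("%s\tsecret/%s\n", v.Name, v.Secret.SecretName)
		case v.EmptyDir != nil:
			fmt.Printf("%s\temptyDir\n", v.Name)
		default:
			fmt.Printf("%s\t(other)\n", v.Name)
		}
	}
}
```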
Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.978696 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerStarted","Data":"7b68aed09cf5800a911bc132263ab370dfb66e28965f4bfc832fd1127ff115ed"}
Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.978727 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerStarted","Data":"65a62ab8b2bbe20cceccec6a0fda381c5e09d1c1cb6133596de64c6abeafec91"}
Dec 04 15:43:00 crc kubenswrapper[4676]: I1204 15:43:00.978739 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerStarted","Data":"81960d268903d03dff76ebb70675699fa04450ebc73850c2f61ba19dd84cf935"}
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.029898 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.048164 4676 scope.go:117] "RemoveContainer" containerID="04227894183ed8d0beb68429ee581220ad0e5bdfa734ac33c5803dc160c87425"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.052655 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.066722 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.06669575 podStartE2EDuration="3.06669575s" podCreationTimestamp="2025-12-04 15:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:43:01.043768045 +0000 UTC m=+1388.478437902" watchObservedRunningTime="2025-12-04 15:43:01.06669575 +0000 UTC m=+1388.501365627"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.079944 4676 scope.go:117] "RemoveContainer" containerID="1adc45a5d71cebb78915732286b10a517cb3922fcca22a36b3088503fe202639"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.092031 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:43:01 crc kubenswrapper[4676]: E1204 15:43:01.092607 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-central-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.092631 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-central-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: E1204 15:43:01.092656 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="proxy-httpd"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.092665 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="proxy-httpd"
Dec 04 15:43:01 crc kubenswrapper[4676]: E1204 15:43:01.092686 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="sg-core"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.092696 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="sg-core"
Dec 04 15:43:01 crc kubenswrapper[4676]: E1204 15:43:01.092716 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-notification-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.092725 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-notification-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.093003 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="proxy-httpd"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.093041 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="sg-core"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.093057 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-central-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.093083 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" containerName="ceilometer-notification-agent"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.095609 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.099600 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.101818 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.104789 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.105770 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.186948 4676 scope.go:117] "RemoveContainer" containerID="53c496fbeeec7bc1df17ed072378d4c91583dadc7ef052f9d71135e5f817c369"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.189585 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-wfvln"]
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242425 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-run-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242517 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-log-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242638 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242736 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljtnf\" (UniqueName: \"kubernetes.io/projected/920f3ae5-c94b-486c-b387-6774d1e29587-kube-api-access-ljtnf\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242790 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-config-data\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242830 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.242865 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-scripts\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.344682 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-run-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.344742 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.344771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-log-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.344869 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.344967 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljtnf\" (UniqueName: \"kubernetes.io/projected/920f3ae5-c94b-486c-b387-6774d1e29587-kube-api-access-ljtnf\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.345000 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-config-data\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.345030 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.345064 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-scripts\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.345471 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-run-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.348416 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/920f3ae5-c94b-486c-b387-6774d1e29587-log-httpd\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.350386 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.351266 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-config-data\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.353849 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-scripts\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.355837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.364733 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/920f3ae5-c94b-486c-b387-6774d1e29587-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.364816 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljtnf\" (UniqueName: \"kubernetes.io/projected/920f3ae5-c94b-486c-b387-6774d1e29587-kube-api-access-ljtnf\") pod \"ceilometer-0\" (UID: \"920f3ae5-c94b-486c-b387-6774d1e29587\") " pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.399048 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6387383e-d39c-4e26-b204-1fedb37707b0" path="/var/lib/kubelet/pods/6387383e-d39c-4e26-b204-1fedb37707b0/volumes"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.480642 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 04 15:43:01 crc kubenswrapper[4676]: I1204 15:43:01.995069 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 04 15:43:02 crc kubenswrapper[4676]: W1204 15:43:02.000401 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod920f3ae5_c94b_486c_b387_6774d1e29587.slice/crio-0c9a275d097b5246b951ea07572bfdafe11fd6e65d9084a874086e902e76723f WatchSource:0}: Error finding container 0c9a275d097b5246b951ea07572bfdafe11fd6e65d9084a874086e902e76723f: Status 404 returned error can't find the container with id 0c9a275d097b5246b951ea07572bfdafe11fd6e65d9084a874086e902e76723f
Dec 04 15:43:02 crc kubenswrapper[4676]: I1204 15:43:02.000401 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wfvln" event={"ID":"add0c0ae-e35b-47c2-b4f3-15af24cd97bf","Type":"ContainerStarted","Data":"e430f27031e3208c1416a3a4c8552d7a026ac0f1ec4c0f9d880cdd8d2a124fb5"}
Dec 04 15:43:02 crc kubenswrapper[4676]: I1204 15:43:02.000452 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wfvln" event={"ID":"add0c0ae-e35b-47c2-b4f3-15af24cd97bf","Type":"ContainerStarted","Data":"68e29f86a9caa6943bb31d9a04e53313d936824c106e27eb8a7df4c404d2516a"}
Dec 04 15:43:02 crc kubenswrapper[4676]: I1204 15:43:02.021604 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-wfvln" podStartSLOduration=2.021581489 podStartE2EDuration="2.021581489s" podCreationTimestamp="2025-12-04 15:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:43:02.01681282 +0000 UTC m=+1389.451482677" watchObservedRunningTime="2025-12-04 15:43:02.021581489 +0000 UTC m=+1389.456251346"
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.011183 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"920f3ae5-c94b-486c-b387-6774d1e29587","Type":"ContainerStarted","Data":"8783caa21272205acb7535badcd5d5b1b89f9300642beb5ac134c26780dbc842"}
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.011427 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"920f3ae5-c94b-486c-b387-6774d1e29587","Type":"ContainerStarted","Data":"0c9a275d097b5246b951ea07572bfdafe11fd6e65d9084a874086e902e76723f"}
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.295957 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw"
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.430136 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"]
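[Annotation, not part of the captured journal: the pod_startup_latency_tracker entries above report podStartE2EDuration as roughly the gap between pod creation and the time the pod was observed running. A stdlib-only Go sketch reproducing that arithmetic from the two timestamps printed in the nova-cell1-cell-mapping-wfvln entry; the layout string matches Go's default time.Time formatting used in the log.]

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout for timestamps of the form "2025-12-04 15:43:02.01681282 +0000 UTC".
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-04 15:43:00 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-12-04 15:43:02.01681282 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2.01681282s. The logged 2.021581489s differs slightly because
	// the tracker samples its own clock when recording the observation.
	fmt.Println(running.Sub(created))
}
```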
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.430423 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="dnsmasq-dns" containerID="cri-o://66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c" gracePeriod=10
Dec 04 15:43:03 crc kubenswrapper[4676]: I1204 15:43:03.640270 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.208:5353: connect: connection refused"
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.018639 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd"
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.024796 4676 generic.go:334] "Generic (PLEG): container finished" podID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerID="66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c" exitCode=0
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.024849 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" event={"ID":"5a9e7336-af8d-48d4-82a4-3631cb57ecc8","Type":"ContainerDied","Data":"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c"}
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.024876 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd" event={"ID":"5a9e7336-af8d-48d4-82a4-3631cb57ecc8","Type":"ContainerDied","Data":"9407256b2495c0defca965c5dcfec1f7df79882b1ba59115fe7c7f3a1bebf82a"}
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.024894 4676 scope.go:117] "RemoveContainer" containerID="66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c"
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.025872 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-568844f8ff-tk8hd"
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.044370 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"920f3ae5-c94b-486c-b387-6774d1e29587","Type":"ContainerStarted","Data":"727f196452864479b3f4701930674387eac8aef063754c2e65e3eca42e5a0a2a"}
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.044459 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"920f3ae5-c94b-486c-b387-6774d1e29587","Type":"ContainerStarted","Data":"ffd3a1c58a6cfbea754e600a1d9d69dc412248ecf55c313a663817a672c0a37d"}
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.082466 4676 scope.go:117] "RemoveContainer" containerID="99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420"
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203309 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203450 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203517 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62kkg\" (UniqueName: \"kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203639 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203701 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.203718 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc\") pod \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\" (UID: \"5a9e7336-af8d-48d4-82a4-3631cb57ecc8\") "
Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.209463 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg" (OuterVolumeSpecName: "kube-api-access-62kkg") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "kube-api-access-62kkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
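[Annotation, not part of the captured journal: the readiness failure logged above ("dial tcp 10.217.0.208:5353: connect: connection refused") has the shape of a TCP-socket probe: a bare connect attempt with a deadline, where any dial error marks the container unready. A minimal stdlib stand-in, using the pod IP and port from the log; that address is unreachable outside the cluster, so expect the dial error.]

```go
package main

import (
	"fmt"
	"net"
	"time"
)

// probeTCP performs the same check a tcpSocket probe does: connect, then
// immediately close; a dial error within the deadline counts as a failure.
func probeTCP(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	// Pod IP/port taken from the probe entry above; hypothetical elsewhere.
	if err := probeTCP("10.217.0.208:5353", time.Second); err != nil {
		fmt.Println("probe failed:", err) // e.g. "connect: connection refused"
		return
	}
	fmt.Println("ready")
}
```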
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.270343 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.278683 4676 scope.go:117] "RemoveContainer" containerID="66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.279398 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: E1204 15:43:04.279498 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c\": container with ID starting with 66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c not found: ID does not exist" containerID="66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.279540 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c"} err="failed to get container status \"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c\": rpc error: code = NotFound desc = could not find container \"66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c\": container with ID starting with 66928b167326350af612c2e17a75024aca37a1107942c6df90b51bc80acd9e0c not found: ID does not exist" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.279567 4676 scope.go:117] "RemoveContainer" containerID="99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.279592 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config" (OuterVolumeSpecName: "config") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: E1204 15:43:04.279888 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420\": container with ID starting with 99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420 not found: ID does not exist" containerID="99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.279930 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420"} err="failed to get container status \"99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420\": rpc error: code = NotFound desc = could not find container \"99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420\": container with ID starting with 99a99a2f434f8c96fa6ddccb162d1687d026b0641b479f08fa7516c4d41f9420 not found: ID does not exist" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.305482 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.306606 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.306651 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.306661 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.306671 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62kkg\" (UniqueName: \"kubernetes.io/projected/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-kube-api-access-62kkg\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.306679 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.308639 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a9e7336-af8d-48d4-82a4-3631cb57ecc8" (UID: "5a9e7336-af8d-48d4-82a4-3631cb57ecc8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.409984 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"] Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.414673 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a9e7336-af8d-48d4-82a4-3631cb57ecc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:04 crc kubenswrapper[4676]: I1204 15:43:04.419181 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-568844f8ff-tk8hd"] Dec 04 15:43:05 crc kubenswrapper[4676]: I1204 15:43:05.395012 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" path="/var/lib/kubelet/pods/5a9e7336-af8d-48d4-82a4-3631cb57ecc8/volumes" Dec 04 15:43:06 crc kubenswrapper[4676]: I1204 15:43:06.070145 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"920f3ae5-c94b-486c-b387-6774d1e29587","Type":"ContainerStarted","Data":"512774ac8c4394a00afbec8c30874096cd2c85220f4e4ac8df1f468e23effd23"} Dec 04 15:43:06 crc kubenswrapper[4676]: I1204 15:43:06.070333 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 04 15:43:06 crc kubenswrapper[4676]: I1204 15:43:06.157579 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.252460391 podStartE2EDuration="5.157551477s" podCreationTimestamp="2025-12-04 15:43:01 +0000 UTC" firstStartedPulling="2025-12-04 15:43:02.004336049 +0000 UTC m=+1389.439005906" lastFinishedPulling="2025-12-04 15:43:04.909427135 +0000 UTC m=+1392.344096992" observedRunningTime="2025-12-04 15:43:06.137335301 +0000 UTC m=+1393.572005158" watchObservedRunningTime="2025-12-04 15:43:06.157551477 +0000 UTC m=+1393.592221334" Dec 04 15:43:08 crc kubenswrapper[4676]: I1204 15:43:08.094079 4676 generic.go:334] "Generic (PLEG): container finished" podID="add0c0ae-e35b-47c2-b4f3-15af24cd97bf" containerID="e430f27031e3208c1416a3a4c8552d7a026ac0f1ec4c0f9d880cdd8d2a124fb5" exitCode=0 Dec 04 15:43:08 crc kubenswrapper[4676]: I1204 15:43:08.094130 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wfvln" event={"ID":"add0c0ae-e35b-47c2-b4f3-15af24cd97bf","Type":"ContainerDied","Data":"e430f27031e3208c1416a3a4c8552d7a026ac0f1ec4c0f9d880cdd8d2a124fb5"} Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.323528 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.323813 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.550898 4676 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.644833 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle\") pod \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") "
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.644939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv2qm\" (UniqueName: \"kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm\") pod \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") "
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.645041 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data\") pod \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") "
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.645224 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts\") pod \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\" (UID: \"add0c0ae-e35b-47c2-b4f3-15af24cd97bf\") "
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.650533 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts" (OuterVolumeSpecName: "scripts") pod "add0c0ae-e35b-47c2-b4f3-15af24cd97bf" (UID: "add0c0ae-e35b-47c2-b4f3-15af24cd97bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.652323 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm" (OuterVolumeSpecName: "kube-api-access-vv2qm") pod "add0c0ae-e35b-47c2-b4f3-15af24cd97bf" (UID: "add0c0ae-e35b-47c2-b4f3-15af24cd97bf"). InnerVolumeSpecName "kube-api-access-vv2qm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.677769 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "add0c0ae-e35b-47c2-b4f3-15af24cd97bf" (UID: "add0c0ae-e35b-47c2-b4f3-15af24cd97bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.687061 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data" (OuterVolumeSpecName: "config-data") pod "add0c0ae-e35b-47c2-b4f3-15af24cd97bf" (UID: "add0c0ae-e35b-47c2-b4f3-15af24cd97bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.747739 4676 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-scripts\") on node \"crc\" DevicePath \"\""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.747782 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.747796 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv2qm\" (UniqueName: \"kubernetes.io/projected/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-kube-api-access-vv2qm\") on node \"crc\" DevicePath \"\""
Dec 04 15:43:09 crc kubenswrapper[4676]: I1204 15:43:09.747808 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/add0c0ae-e35b-47c2-b4f3-15af24cd97bf-config-data\") on node \"crc\" DevicePath \"\""
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.121019 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-wfvln" event={"ID":"add0c0ae-e35b-47c2-b4f3-15af24cd97bf","Type":"ContainerDied","Data":"68e29f86a9caa6943bb31d9a04e53313d936824c106e27eb8a7df4c404d2516a"}
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.121089 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e29f86a9caa6943bb31d9a04e53313d936824c106e27eb8a7df4c404d2516a"
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.121180 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-wfvln"
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.317262 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.317584 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-log" containerID="cri-o://65a62ab8b2bbe20cceccec6a0fda381c5e09d1c1cb6133596de64c6abeafec91" gracePeriod=30
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.318219 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-api" containerID="cri-o://7b68aed09cf5800a911bc132263ab370dfb66e28965f4bfc832fd1127ff115ed" gracePeriod=30
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.325211 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.325460 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="33792424-6952-4280-9589-83aeb894841e" containerName="nova-scheduler-scheduler" containerID="cri-o://02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" gracePeriod=30
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.331029 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.331321 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.354537 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.354814 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-log" containerID="cri-o://22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb" gracePeriod=30
Dec 04 15:43:10 crc kubenswrapper[4676]: I1204 15:43:10.355422 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-metadata" containerID="cri-o://ec30475a9a69e1763b10181527f9f74232e50079a86b1bb6a9f9a882548167ab" gracePeriod=30
Dec 04 15:43:10 crc kubenswrapper[4676]: E1204 15:43:10.523463 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9683823e_29da_45e3_a662_84320cc6a8aa.slice/crio-conmon-22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb.scope\": RecentStats: unable to find data in memory cache]"
Dec 04 15:43:11 crc kubenswrapper[4676]: I1204 15:43:11.138313 4676 generic.go:334] "Generic (PLEG): container finished" podID="9683823e-29da-45e3-a662-84320cc6a8aa" containerID="22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb" exitCode=143
Dec 04 15:43:11 crc kubenswrapper[4676]: I1204 15:43:11.138376 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerDied","Data":"22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb"}
Dec 04 15:43:11 crc kubenswrapper[4676]: I1204 15:43:11.142436 4676 generic.go:334] "Generic (PLEG): container finished" podID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerID="65a62ab8b2bbe20cceccec6a0fda381c5e09d1c1cb6133596de64c6abeafec91" exitCode=143
Dec 04 15:43:11 crc kubenswrapper[4676]: I1204 15:43:11.142479 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerDied","Data":"65a62ab8b2bbe20cceccec6a0fda381c5e09d1c1cb6133596de64c6abeafec91"}
Dec 04 15:43:11 crc kubenswrapper[4676]: E1204 15:43:11.866571 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 04 15:43:11 crc kubenswrapper[4676]: E1204 15:43:11.868293 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 04 15:43:11 crc kubenswrapper[4676]: E1204 15:43:11.869749 4676 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 04 15:43:11 crc kubenswrapper[4676]: E1204 15:43:11.869785 4676 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="33792424-6952-4280-9589-83aeb894841e" containerName="nova-scheduler-scheduler"
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.158399 4676 generic.go:334] "Generic (PLEG): container finished" podID="9683823e-29da-45e3-a662-84320cc6a8aa" containerID="ec30475a9a69e1763b10181527f9f74232e50079a86b1bb6a9f9a882548167ab" exitCode=0
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.158449 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerDied","Data":"ec30475a9a69e1763b10181527f9f74232e50079a86b1bb6a9f9a882548167ab"}
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.530973 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.612427 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6rsc\" (UniqueName: \"kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc\") pod \"9683823e-29da-45e3-a662-84320cc6a8aa\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") "
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.612553 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data\") pod \"9683823e-29da-45e3-a662-84320cc6a8aa\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") "
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.612680 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs\") pod \"9683823e-29da-45e3-a662-84320cc6a8aa\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") "
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.612721 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle\") pod \"9683823e-29da-45e3-a662-84320cc6a8aa\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") "
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.612889 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs\") pod \"9683823e-29da-45e3-a662-84320cc6a8aa\" (UID: \"9683823e-29da-45e3-a662-84320cc6a8aa\") "
Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.615085 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs" (OuterVolumeSpecName: "logs") pod "9683823e-29da-45e3-a662-84320cc6a8aa" (UID: "9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
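[Annotation, not part of the captured journal: the nova-api-0 startup-probe failures above carry net/http's client-timeout error ("Client.Timeout exceeded while awaiting headers"), the signature of an HTTPS GET that hit its per-request deadline before the server responded. A stdlib sketch of that probe shape; the URL is the pod IP from the log (unreachable elsewhere), and skipping TLS verification is an assumption for probing a pod-internal certificate.]

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	// A short client-side timeout reproduces the logged failure mode when
	// the endpoint accepts the connection but is slow to send headers.
	client := &http.Client{
		Timeout: 1 * time.Second,
		Transport: &http.Transport{
			// Assumption: the pod serves a cert the prober does not verify.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://10.217.0.218:8774/")
	if err != nil {
		fmt.Println("startup probe failed:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("status:", resp.Status)
}
```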
"9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.618793 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc" (OuterVolumeSpecName: "kube-api-access-w6rsc") pod "9683823e-29da-45e3-a662-84320cc6a8aa" (UID: "9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "kube-api-access-w6rsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.645716 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data" (OuterVolumeSpecName: "config-data") pod "9683823e-29da-45e3-a662-84320cc6a8aa" (UID: "9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.647436 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9683823e-29da-45e3-a662-84320cc6a8aa" (UID: "9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.685324 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9683823e-29da-45e3-a662-84320cc6a8aa" (UID: "9683823e-29da-45e3-a662-84320cc6a8aa"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.721378 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9683823e-29da-45e3-a662-84320cc6a8aa-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.721429 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.721485 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.721500 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6rsc\" (UniqueName: \"kubernetes.io/projected/9683823e-29da-45e3-a662-84320cc6a8aa-kube-api-access-w6rsc\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:12 crc kubenswrapper[4676]: I1204 15:43:12.721514 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9683823e-29da-45e3-a662-84320cc6a8aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.210059 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9683823e-29da-45e3-a662-84320cc6a8aa","Type":"ContainerDied","Data":"cd19c79b7fea5c3e8ad01f10f0adfd134cc1d2555699c3dbb9b736b5be86cdcd"} Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.210109 4676 scope.go:117] "RemoveContainer" containerID="ec30475a9a69e1763b10181527f9f74232e50079a86b1bb6a9f9a882548167ab" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.210286 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.278076 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.298851 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.299156 4676 scope.go:117] "RemoveContainer" containerID="22ecceff848869a7e272bc1d9110808800cf62936c047d32b7d82a51664e3beb" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.310462 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:43:13 crc kubenswrapper[4676]: E1204 15:43:13.310892 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="init" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.310965 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="init" Dec 04 15:43:13 crc kubenswrapper[4676]: E1204 15:43:13.311000 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="dnsmasq-dns" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311010 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="dnsmasq-dns" Dec 04 15:43:13 crc kubenswrapper[4676]: E1204 15:43:13.311023 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-log" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311031 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-log" Dec 04 15:43:13 crc kubenswrapper[4676]: E1204 15:43:13.311042 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add0c0ae-e35b-47c2-b4f3-15af24cd97bf" containerName="nova-manage" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311058 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="add0c0ae-e35b-47c2-b4f3-15af24cd97bf" containerName="nova-manage" Dec 04 15:43:13 crc kubenswrapper[4676]: E1204 15:43:13.311069 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-metadata" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311074 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-metadata" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311290 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-metadata" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311312 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="add0c0ae-e35b-47c2-b4f3-15af24cd97bf" containerName="nova-manage" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311324 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a9e7336-af8d-48d4-82a4-3631cb57ecc8" containerName="dnsmasq-dns" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.311336 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" containerName="nova-metadata-log" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.312446 4676 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.317172 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.317221 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.329089 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.342508 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.342611 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-config-data\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.342696 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-logs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.342721 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.342780 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjg68\" (UniqueName: \"kubernetes.io/projected/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-kube-api-access-pjg68\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.396786 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9683823e-29da-45e3-a662-84320cc6a8aa" path="/var/lib/kubelet/pods/9683823e-29da-45e3-a662-84320cc6a8aa/volumes" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.444873 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.444973 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjg68\" (UniqueName: \"kubernetes.io/projected/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-kube-api-access-pjg68\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 
15:43:13.445137 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.445189 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-config-data\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.445251 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-logs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.445625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-logs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.449770 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.453875 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-config-data\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.456460 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.462735 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjg68\" (UniqueName: \"kubernetes.io/projected/e0794dc7-c796-4e57-bf9e-eefb1ac8e72c-kube-api-access-pjg68\") pod \"nova-metadata-0\" (UID: \"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c\") " pod="openstack/nova-metadata-0" Dec 04 15:43:13 crc kubenswrapper[4676]: I1204 15:43:13.633456 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 04 15:43:14 crc kubenswrapper[4676]: I1204 15:43:14.794089 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.240992 4676 generic.go:334] "Generic (PLEG): container finished" podID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerID="7b68aed09cf5800a911bc132263ab370dfb66e28965f4bfc832fd1127ff115ed" exitCode=0 Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.241052 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerDied","Data":"7b68aed09cf5800a911bc132263ab370dfb66e28965f4bfc832fd1127ff115ed"} Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.262802 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c","Type":"ContainerStarted","Data":"e6768f32d5da46c481d0ffd59cb52f21c586b13195a58993f02c7e1306ba1d34"} Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.262857 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c","Type":"ContainerStarted","Data":"44aa5347343f1c8b5cf24be0ab8bb079aa6550026c331d85d4acdcf0269d7d98"} Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.415670 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492454 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492509 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vht4k\" (UniqueName: \"kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492594 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492633 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492665 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.492781 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs\") pod \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\" (UID: \"41ecdd72-d01b-46f2-b6c6-cafe592037bb\") " Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.493059 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs" (OuterVolumeSpecName: "logs") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.494127 4676 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41ecdd72-d01b-46f2-b6c6-cafe592037bb-logs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.502161 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k" (OuterVolumeSpecName: "kube-api-access-vht4k") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "kube-api-access-vht4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.527261 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data" (OuterVolumeSpecName: "config-data") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.533993 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.554144 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.555623 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "41ecdd72-d01b-46f2-b6c6-cafe592037bb" (UID: "41ecdd72-d01b-46f2-b6c6-cafe592037bb"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.596391 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vht4k\" (UniqueName: \"kubernetes.io/projected/41ecdd72-d01b-46f2-b6c6-cafe592037bb-kube-api-access-vht4k\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.596462 4676 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.596478 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.596490 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:15 crc kubenswrapper[4676]: I1204 15:43:15.596502 4676 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/41ecdd72-d01b-46f2-b6c6-cafe592037bb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.282116 4676 generic.go:334] "Generic (PLEG): container finished" podID="33792424-6952-4280-9589-83aeb894841e" containerID="02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" exitCode=0 Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.282196 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33792424-6952-4280-9589-83aeb894841e","Type":"ContainerDied","Data":"02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8"} Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.284749 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e0794dc7-c796-4e57-bf9e-eefb1ac8e72c","Type":"ContainerStarted","Data":"f72e74c227793ba068b2c07dbd33c89c19de620b24e08bb70e2527af7b3a21a5"} Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.287399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"41ecdd72-d01b-46f2-b6c6-cafe592037bb","Type":"ContainerDied","Data":"81960d268903d03dff76ebb70675699fa04450ebc73850c2f61ba19dd84cf935"} Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.287443 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.287455 4676 scope.go:117] "RemoveContainer" containerID="7b68aed09cf5800a911bc132263ab370dfb66e28965f4bfc832fd1127ff115ed" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.321820 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.321796632 podStartE2EDuration="3.321796632s" podCreationTimestamp="2025-12-04 15:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:43:16.308669421 +0000 UTC m=+1403.743339278" watchObservedRunningTime="2025-12-04 15:43:16.321796632 +0000 UTC m=+1403.756466489" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.326762 4676 scope.go:117] "RemoveContainer" containerID="65a62ab8b2bbe20cceccec6a0fda381c5e09d1c1cb6133596de64c6abeafec91" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.358050 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.376662 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.392104 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:16 crc kubenswrapper[4676]: E1204 15:43:16.392561 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-api" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.392582 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-api" Dec 04 15:43:16 crc kubenswrapper[4676]: E1204 15:43:16.392633 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-log" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.392640 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-log" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.392877 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-log" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.392966 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" containerName="nova-api-api" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.395690 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.404448 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.404630 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.405249 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.411439 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513679 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68wp\" (UniqueName: \"kubernetes.io/projected/2199ce2b-f085-4ad8-8048-d13b4399ff13-kube-api-access-f68wp\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513805 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2199ce2b-f085-4ad8-8048-d13b4399ff13-logs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513837 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-config-data\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513925 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513950 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-public-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.513988 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.615708 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2199ce2b-f085-4ad8-8048-d13b4399ff13-logs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-config-data\") pod \"nova-api-0\" (UID: 
\"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616172 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616215 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-public-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616262 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2199ce2b-f085-4ad8-8048-d13b4399ff13-logs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.616322 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68wp\" (UniqueName: \"kubernetes.io/projected/2199ce2b-f085-4ad8-8048-d13b4399ff13-kube-api-access-f68wp\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.622040 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-public-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.631168 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.631999 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-config-data\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.635246 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68wp\" (UniqueName: \"kubernetes.io/projected/2199ce2b-f085-4ad8-8048-d13b4399ff13-kube-api-access-f68wp\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.641922 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2199ce2b-f085-4ad8-8048-d13b4399ff13-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2199ce2b-f085-4ad8-8048-d13b4399ff13\") " 
pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.726461 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.732383 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.819693 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzgxs\" (UniqueName: \"kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs\") pod \"33792424-6952-4280-9589-83aeb894841e\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.820045 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle\") pod \"33792424-6952-4280-9589-83aeb894841e\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.820082 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data\") pod \"33792424-6952-4280-9589-83aeb894841e\" (UID: \"33792424-6952-4280-9589-83aeb894841e\") " Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.824979 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs" (OuterVolumeSpecName: "kube-api-access-lzgxs") pod "33792424-6952-4280-9589-83aeb894841e" (UID: "33792424-6952-4280-9589-83aeb894841e"). InnerVolumeSpecName "kube-api-access-lzgxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.851109 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data" (OuterVolumeSpecName: "config-data") pod "33792424-6952-4280-9589-83aeb894841e" (UID: "33792424-6952-4280-9589-83aeb894841e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.852780 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33792424-6952-4280-9589-83aeb894841e" (UID: "33792424-6952-4280-9589-83aeb894841e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.927555 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.927596 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33792424-6952-4280-9589-83aeb894841e-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:16 crc kubenswrapper[4676]: I1204 15:43:16.927609 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzgxs\" (UniqueName: \"kubernetes.io/projected/33792424-6952-4280-9589-83aeb894841e-kube-api-access-lzgxs\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.310988 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.312174 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"33792424-6952-4280-9589-83aeb894841e","Type":"ContainerDied","Data":"b762eee2583702fba0dbda79e0422398c49206fa28c1487e068bfbd27496cc94"} Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.312241 4676 scope.go:117] "RemoveContainer" containerID="02e73354143ff11b8452810bb77664b735067131b7bafe0f31fbd3c66bbe7cc8" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.317489 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 04 15:43:17 crc kubenswrapper[4676]: W1204 15:43:17.334597 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2199ce2b_f085_4ad8_8048_d13b4399ff13.slice/crio-50368f43a39b751869812621225f8556bbbb2c77886d60f774a092949ef148d5 WatchSource:0}: Error finding container 50368f43a39b751869812621225f8556bbbb2c77886d60f774a092949ef148d5: Status 404 returned error can't find the container with id 50368f43a39b751869812621225f8556bbbb2c77886d60f774a092949ef148d5 Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.360779 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.370074 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.377358 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 04 15:43:17 crc kubenswrapper[4676]: E1204 15:43:17.377750 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33792424-6952-4280-9589-83aeb894841e" containerName="nova-scheduler-scheduler" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.377766 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="33792424-6952-4280-9589-83aeb894841e" containerName="nova-scheduler-scheduler" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.377983 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="33792424-6952-4280-9589-83aeb894841e" containerName="nova-scheduler-scheduler" Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.378631 4676 util.go:30] "No sandbox for pod can be found. 
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.414806 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.439296 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2865n\" (UniqueName: \"kubernetes.io/projected/bdf5ba9f-064d-481b-be8f-9682f56de62e-kube-api-access-2865n\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.439543 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.439721 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-config-data\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.454866 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33792424-6952-4280-9589-83aeb894841e" path="/var/lib/kubelet/pods/33792424-6952-4280-9589-83aeb894841e/volumes"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.455702 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ecdd72-d01b-46f2-b6c6-cafe592037bb" path="/var/lib/kubelet/pods/41ecdd72-d01b-46f2-b6c6-cafe592037bb/volumes"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.456523 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.541595 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2865n\" (UniqueName: \"kubernetes.io/projected/bdf5ba9f-064d-481b-be8f-9682f56de62e-kube-api-access-2865n\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.541724 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.541809 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-config-data\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.547298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.547380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf5ba9f-064d-481b-be8f-9682f56de62e-config-data\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.561743 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2865n\" (UniqueName: \"kubernetes.io/projected/bdf5ba9f-064d-481b-be8f-9682f56de62e-kube-api-access-2865n\") pod \"nova-scheduler-0\" (UID: \"bdf5ba9f-064d-481b-be8f-9682f56de62e\") " pod="openstack/nova-scheduler-0"
Dec 04 15:43:17 crc kubenswrapper[4676]: I1204 15:43:17.731259 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.190111 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 04 15:43:18 crc kubenswrapper[4676]: W1204 15:43:18.195241 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf5ba9f_064d_481b_be8f_9682f56de62e.slice/crio-cf7cdd05a1fc1ebd6a132307f021b5888ed961b2f1b2e5ab553969c6686b5a7b WatchSource:0}: Error finding container cf7cdd05a1fc1ebd6a132307f021b5888ed961b2f1b2e5ab553969c6686b5a7b: Status 404 returned error can't find the container with id cf7cdd05a1fc1ebd6a132307f021b5888ed961b2f1b2e5ab553969c6686b5a7b
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.333134 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdf5ba9f-064d-481b-be8f-9682f56de62e","Type":"ContainerStarted","Data":"cf7cdd05a1fc1ebd6a132307f021b5888ed961b2f1b2e5ab553969c6686b5a7b"}
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.334792 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2199ce2b-f085-4ad8-8048-d13b4399ff13","Type":"ContainerStarted","Data":"4ddd588aaf7d3276f9cb7ebda6dfc9d9729f45b90cefa89af3ff81bcc39ab652"}
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.334852 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2199ce2b-f085-4ad8-8048-d13b4399ff13","Type":"ContainerStarted","Data":"076aff12b6292b119c4a9c1d625371f9e81128864b59c0cd940e0ea77389fa5d"}
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.334864 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2199ce2b-f085-4ad8-8048-d13b4399ff13","Type":"ContainerStarted","Data":"50368f43a39b751869812621225f8556bbbb2c77886d60f774a092949ef148d5"}
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.634679 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 04 15:43:18 crc kubenswrapper[4676]: I1204 15:43:18.634937 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 04 15:43:19 crc kubenswrapper[4676]: I1204 15:43:19.349965 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bdf5ba9f-064d-481b-be8f-9682f56de62e","Type":"ContainerStarted","Data":"da7c5abf6ada8247c737ccc833c50ed28624cdd725d96ef2a4c8cd8c65dceb27"}
Dec 04 15:43:19 crc kubenswrapper[4676]: I1204 15:43:19.382164 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.382147061 podStartE2EDuration="2.382147061s" podCreationTimestamp="2025-12-04 15:43:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:43:19.378803224 +0000 UTC m=+1406.813473101" watchObservedRunningTime="2025-12-04 15:43:19.382147061 +0000 UTC m=+1406.816816918"
Dec 04 15:43:19 crc kubenswrapper[4676]: I1204 15:43:19.411178 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.411156662 podStartE2EDuration="3.411156662s" podCreationTimestamp="2025-12-04 15:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:43:19.404180339 +0000 UTC m=+1406.838850196" watchObservedRunningTime="2025-12-04 15:43:19.411156662 +0000 UTC m=+1406.845826519"
Dec 04 15:43:22 crc kubenswrapper[4676]: I1204 15:43:22.732448 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.054824 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"]
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.057125 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.067414 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"]
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.157279 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.157353 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.157466 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbkg\" (UniqueName: \"kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.259807 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.259878 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.259936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbkg\" (UniqueName: \"kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.260368 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.260480 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.285620 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbkg\" (UniqueName: \"kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg\") pod \"redhat-operators-jxzz4\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.379731 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxzz4"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.635337 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.635409 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 04 15:43:23 crc kubenswrapper[4676]: I1204 15:43:23.870642 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"]
Dec 04 15:43:23 crc kubenswrapper[4676]: W1204 15:43:23.872182 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod128691ed_9329_419a_9de6_83608e8f56e0.slice/crio-b894ff2446f2d949801c53b1fe6af1d075c8c40124d06f763a3165a9e55b0621 WatchSource:0}: Error finding container b894ff2446f2d949801c53b1fe6af1d075c8c40124d06f763a3165a9e55b0621: Status 404 returned error can't find the container with id b894ff2446f2d949801c53b1fe6af1d075c8c40124d06f763a3165a9e55b0621
Dec 04 15:43:24 crc kubenswrapper[4676]: I1204 15:43:24.408091 4676 generic.go:334] "Generic (PLEG): container finished" podID="128691ed-9329-419a-9de6-83608e8f56e0" containerID="213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23" exitCode=0
Dec 04 15:43:24 crc kubenswrapper[4676]: I1204 15:43:24.408192 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerDied","Data":"213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23"}
Dec 04 15:43:24 crc kubenswrapper[4676]: I1204 15:43:24.408236 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerStarted","Data":"b894ff2446f2d949801c53b1fe6af1d075c8c40124d06f763a3165a9e55b0621"}
Dec 04 15:43:24 crc kubenswrapper[4676]: I1204 15:43:24.647293 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e0794dc7-c796-4e57-bf9e-eefb1ac8e72c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:24 crc kubenswrapper[4676]: I1204 15:43:24.647695 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e0794dc7-c796-4e57-bf9e-eefb1ac8e72c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:26 crc kubenswrapper[4676]: I1204 15:43:26.460756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerStarted","Data":"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7"}
Dec 04 15:43:26 crc kubenswrapper[4676]: I1204 15:43:26.727331 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 04 15:43:26 crc kubenswrapper[4676]: I1204 15:43:26.727395 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 04 15:43:27 crc kubenswrapper[4676]: I1204 15:43:27.732667 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 04 15:43:27 crc kubenswrapper[4676]: I1204 15:43:27.743116 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2199ce2b-f085-4ad8-8048-d13b4399ff13" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:27 crc kubenswrapper[4676]: I1204 15:43:27.743116 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2199ce2b-f085-4ad8-8048-d13b4399ff13" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.222:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 04 15:43:27 crc kubenswrapper[4676]: I1204 15:43:27.775730 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 04 15:43:28 crc kubenswrapper[4676]: I1204 15:43:28.483566 4676 generic.go:334] "Generic (PLEG): container finished" podID="128691ed-9329-419a-9de6-83608e8f56e0" containerID="424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7" exitCode=0
Dec 04 15:43:28 crc kubenswrapper[4676]: I1204 15:43:28.483594 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerDied","Data":"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7"}
Dec 04 15:43:28 crc kubenswrapper[4676]: I1204 15:43:28.535495 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 04 15:43:31 crc kubenswrapper[4676]: I1204 15:43:31.488532 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
(probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 04 15:43:31 crc kubenswrapper[4676]: I1204 15:43:31.537003 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerStarted","Data":"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7"} Dec 04 15:43:31 crc kubenswrapper[4676]: I1204 15:43:31.578017 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jxzz4" podStartSLOduration=2.818243694 podStartE2EDuration="8.577995595s" podCreationTimestamp="2025-12-04 15:43:23 +0000 UTC" firstStartedPulling="2025-12-04 15:43:24.409724932 +0000 UTC m=+1411.844394789" lastFinishedPulling="2025-12-04 15:43:30.169476833 +0000 UTC m=+1417.604146690" observedRunningTime="2025-12-04 15:43:31.569223681 +0000 UTC m=+1419.003893538" watchObservedRunningTime="2025-12-04 15:43:31.577995595 +0000 UTC m=+1419.012665452" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.380171 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.380476 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.642220 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.643588 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.647626 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 15:43:33 crc kubenswrapper[4676]: I1204 15:43:33.648501 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 04 15:43:34 crc kubenswrapper[4676]: I1204 15:43:34.431523 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jxzz4" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="registry-server" probeResult="failure" output=< Dec 04 15:43:34 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 15:43:34 crc kubenswrapper[4676]: > Dec 04 15:43:36 crc kubenswrapper[4676]: I1204 15:43:36.738015 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 15:43:36 crc kubenswrapper[4676]: I1204 15:43:36.738872 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 15:43:36 crc kubenswrapper[4676]: I1204 15:43:36.739059 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 04 15:43:36 crc kubenswrapper[4676]: I1204 15:43:36.758969 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 15:43:37 crc kubenswrapper[4676]: I1204 15:43:37.593739 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 04 15:43:37 crc kubenswrapper[4676]: I1204 15:43:37.618482 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 04 15:43:43 crc kubenswrapper[4676]: I1204 15:43:43.425464 4676 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:43 crc kubenswrapper[4676]: I1204 15:43:43.476964 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:43 crc kubenswrapper[4676]: I1204 15:43:43.664558 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"] Dec 04 15:43:44 crc kubenswrapper[4676]: I1204 15:43:44.658669 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jxzz4" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="registry-server" containerID="cri-o://9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7" gracePeriod=2 Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.116281 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.226131 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities\") pod \"128691ed-9329-419a-9de6-83608e8f56e0\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.226212 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content\") pod \"128691ed-9329-419a-9de6-83608e8f56e0\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.226344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztbkg\" (UniqueName: \"kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg\") pod \"128691ed-9329-419a-9de6-83608e8f56e0\" (UID: \"128691ed-9329-419a-9de6-83608e8f56e0\") " Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.227875 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities" (OuterVolumeSpecName: "utilities") pod "128691ed-9329-419a-9de6-83608e8f56e0" (UID: "128691ed-9329-419a-9de6-83608e8f56e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.238146 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg" (OuterVolumeSpecName: "kube-api-access-ztbkg") pod "128691ed-9329-419a-9de6-83608e8f56e0" (UID: "128691ed-9329-419a-9de6-83608e8f56e0"). InnerVolumeSpecName "kube-api-access-ztbkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.330052 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.330093 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztbkg\" (UniqueName: \"kubernetes.io/projected/128691ed-9329-419a-9de6-83608e8f56e0-kube-api-access-ztbkg\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.338045 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "128691ed-9329-419a-9de6-83608e8f56e0" (UID: "128691ed-9329-419a-9de6-83608e8f56e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.432373 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/128691ed-9329-419a-9de6-83608e8f56e0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.617131 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.670640 4676 generic.go:334] "Generic (PLEG): container finished" podID="128691ed-9329-419a-9de6-83608e8f56e0" containerID="9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7" exitCode=0 Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.670685 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerDied","Data":"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7"} Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.670708 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jxzz4" event={"ID":"128691ed-9329-419a-9de6-83608e8f56e0","Type":"ContainerDied","Data":"b894ff2446f2d949801c53b1fe6af1d075c8c40124d06f763a3165a9e55b0621"} Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.670726 4676 scope.go:117] "RemoveContainer" containerID="9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.672013 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jxzz4" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.705822 4676 scope.go:117] "RemoveContainer" containerID="424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.726081 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"] Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.737849 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jxzz4"] Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.804348 4676 scope.go:117] "RemoveContainer" containerID="213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.869637 4676 scope.go:117] "RemoveContainer" containerID="9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7" Dec 04 15:43:45 crc kubenswrapper[4676]: E1204 15:43:45.870252 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7\": container with ID starting with 9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7 not found: ID does not exist" containerID="9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.870295 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7"} err="failed to get container status \"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7\": rpc error: code = NotFound desc = could not find container \"9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7\": container with ID starting with 9feb993bacb9639e59efb8f3026a9282686b0036e54ef3458ada1995e92c16f7 not found: ID does not exist" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.870324 4676 scope.go:117] "RemoveContainer" containerID="424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7" Dec 04 15:43:45 crc kubenswrapper[4676]: E1204 15:43:45.870589 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7\": container with ID starting with 424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7 not found: ID does not exist" containerID="424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.870615 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7"} err="failed to get container status \"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7\": rpc error: code = NotFound desc = could not find container \"424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7\": container with ID starting with 424ada130bfdc835f1a869fe79dbbb34b3cbff27d62b6b7f0f9945db68e61be7 not found: ID does not exist" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.870630 4676 scope.go:117] "RemoveContainer" containerID="213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23" Dec 04 15:43:45 crc kubenswrapper[4676]: E1204 15:43:45.870852 4676 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23\": container with ID starting with 213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23 not found: ID does not exist" containerID="213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23" Dec 04 15:43:45 crc kubenswrapper[4676]: I1204 15:43:45.870873 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23"} err="failed to get container status \"213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23\": rpc error: code = NotFound desc = could not find container \"213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23\": container with ID starting with 213121ab42ccf678bc14b0badff77f6f620290d34338d3788d5065753d74ec23 not found: ID does not exist" Dec 04 15:43:46 crc kubenswrapper[4676]: I1204 15:43:46.583024 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:47 crc kubenswrapper[4676]: I1204 15:43:47.396467 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="128691ed-9329-419a-9de6-83608e8f56e0" path="/var/lib/kubelet/pods/128691ed-9329-419a-9de6-83608e8f56e0/volumes" Dec 04 15:43:49 crc kubenswrapper[4676]: I1204 15:43:49.237245 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="rabbitmq" containerID="cri-o://dde28b06626f8149535cfc50ea66b8ee5915c6a25e62012e99bd3cb77d058d91" gracePeriod=604797 Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.112433 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="rabbitmq" containerID="cri-o://03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa" gracePeriod=604797 Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.729234 4676 generic.go:334] "Generic (PLEG): container finished" podID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerID="dde28b06626f8149535cfc50ea66b8ee5915c6a25e62012e99bd3cb77d058d91" exitCode=0 Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.729556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerDied","Data":"dde28b06626f8149535cfc50ea66b8ee5915c6a25e62012e99bd3cb77d058d91"} Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.823322 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.937854 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-config-data\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.937926 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-erlang-cookie-secret\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938061 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938150 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-tls\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938180 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz2vc\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938211 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938254 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-plugins\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938281 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938314 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938388 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-erlang-cookie\") pod 
\"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.938423 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\" (UID: \"6bfec4df-7119-489c-a2e8-17dddd0e5c1d\") " Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.941972 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.942292 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.942734 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.947349 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.948867 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.949055 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.953365 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc" (OuterVolumeSpecName: "kube-api-access-sz2vc") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "kube-api-access-sz2vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.958089 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info" (OuterVolumeSpecName: "pod-info") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 15:43:50 crc kubenswrapper[4676]: I1204 15:43:50.980244 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-config-data" (OuterVolumeSpecName: "config-data") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.024606 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf" (OuterVolumeSpecName: "server-conf") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.040746 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz2vc\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-kube-api-access-sz2vc\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.040980 4676 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.041085 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.041207 4676 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.041283 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.041389 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.042556 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.042604 4676 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 
15:43:51.042619 4676 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.042630 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.076726 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.082268 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6bfec4df-7119-489c-a2e8-17dddd0e5c1d" (UID: "6bfec4df-7119-489c-a2e8-17dddd0e5c1d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.149251 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6bfec4df-7119-489c-a2e8-17dddd0e5c1d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.149562 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.842790 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6bfec4df-7119-489c-a2e8-17dddd0e5c1d","Type":"ContainerDied","Data":"eba9024ff6b212171ba475bacce568c31c34c5f7f0101258262a4e0fc6b4fb76"} Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.844487 4676 scope.go:117] "RemoveContainer" containerID="dde28b06626f8149535cfc50ea66b8ee5915c6a25e62012e99bd3cb77d058d91" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.843806 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.882262 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.882420 4676 generic.go:334] "Generic (PLEG): container finished" podID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerID="03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa" exitCode=0 Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.882527 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerDied","Data":"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa"} Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.882556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"743292d4-f5a5-48cd-bcb0-63fb95ac6910","Type":"ContainerDied","Data":"487f160e9670c758fa5fa69d1fdc5eb7d441fde0f1d868194152c72d5169f7cf"} Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.891123 4676 scope.go:117] "RemoveContainer" containerID="1e79cadee4110746d5dcc8072fd80203a89b940c26619c6972fe68e00666b3ab" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.914536 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.932464 4676 scope.go:117] "RemoveContainer" containerID="03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.933201 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.934168 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.934582 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.934697 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.934922 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.935049 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.935214 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqs4z\" 
(UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.935291 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.943459 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.945198 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.954429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.957653 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z" (OuterVolumeSpecName: "kube-api-access-mqs4z") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "kube-api-access-mqs4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.961148 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.972955 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973500 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="extract-content" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973517 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="extract-content" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973535 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="setup-container" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973542 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="setup-container" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973554 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="registry-server" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973559 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="registry-server" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973567 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="rabbitmq" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973573 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="rabbitmq" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973588 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="setup-container" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973594 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="setup-container" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973600 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="rabbitmq" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973608 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="rabbitmq" Dec 04 15:43:51 crc kubenswrapper[4676]: E1204 15:43:51.973621 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="extract-utilities" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973627 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="extract-utilities" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973813 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="128691ed-9329-419a-9de6-83608e8f56e0" containerName="registry-server" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973826 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" containerName="rabbitmq" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.973844 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" containerName="rabbitmq" 
Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.975075 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.999370 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 04 15:43:51 crc kubenswrapper[4676]: I1204 15:43:51.999985 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.000269 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g2s2x" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.000439 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.000587 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.017404 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.017694 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.054962 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.058098 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.058880 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.059316 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.059738 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.059809 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\" (UID: \"743292d4-f5a5-48cd-bcb0-63fb95ac6910\") " Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.061263 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data" (OuterVolumeSpecName: "config-data") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.061407 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info" (OuterVolumeSpecName: "pod-info") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.069438 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b2812cb-4bae-4379-89af-005c5629b8f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.069484 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.069701 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.069862 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b2812cb-4bae-4379-89af-005c5629b8f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.069998 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070163 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m48t\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-kube-api-access-7m48t\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " 
pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070233 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070293 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070317 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070387 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070520 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070598 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070611 4676 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070622 4676 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/743292d4-f5a5-48cd-bcb0-63fb95ac6910-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070632 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070661 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqs4z\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-kube-api-access-mqs4z\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070671 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" 
Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070679 4676 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/743292d4-f5a5-48cd-bcb0-63fb95ac6910-pod-info\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.070687 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.098879 4676 scope.go:117] "RemoveContainer" containerID="a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.101495 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.148616 4676 scope.go:117] "RemoveContainer" containerID="03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa" Dec 04 15:43:52 crc kubenswrapper[4676]: E1204 15:43:52.149722 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa\": container with ID starting with 03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa not found: ID does not exist" containerID="03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.149765 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa"} err="failed to get container status \"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa\": rpc error: code = NotFound desc = could not find container \"03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa\": container with ID starting with 03341c437891fd969a8cde459afcf6f59366fd7ef3fdbecdb328e686a6c37aaa not found: ID does not exist" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.149794 4676 scope.go:117] "RemoveContainer" containerID="a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c" Dec 04 15:43:52 crc kubenswrapper[4676]: E1204 15:43:52.150285 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c\": container with ID starting with a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c not found: ID does not exist" containerID="a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.150314 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c"} err="failed to get container status \"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c\": rpc error: code = NotFound desc = could not find container \"a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c\": container with ID starting with 
a645738992576e9660a8167d136b55f77b87e0533bc2860db115278c9e89293c not found: ID does not exist" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.161399 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf" (OuterVolumeSpecName: "server-conf") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.171734 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "743292d4-f5a5-48cd-bcb0-63fb95ac6910" (UID: "743292d4-f5a5-48cd-bcb0-63fb95ac6910"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.172866 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.172935 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b2812cb-4bae-4379-89af-005c5629b8f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.172960 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173019 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173050 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b2812cb-4bae-4379-89af-005c5629b8f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173100 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173122 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m48t\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-kube-api-access-7m48t\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 
crc kubenswrapper[4676]: I1204 15:43:52.173152 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173179 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173198 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173232 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.174368 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.174927 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.177223 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.177558 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.173244 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-config-data\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.177931 4676 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/743292d4-f5a5-48cd-bcb0-63fb95ac6910-server-conf\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc 
kubenswrapper[4676]: I1204 15:43:52.177983 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.177995 4676 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/743292d4-f5a5-48cd-bcb0-63fb95ac6910-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.178516 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2b2812cb-4bae-4379-89af-005c5629b8f2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.178806 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.179444 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2b2812cb-4bae-4379-89af-005c5629b8f2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.181328 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2b2812cb-4bae-4379-89af-005c5629b8f2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.188502 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.205827 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m48t\" (UniqueName: \"kubernetes.io/projected/2b2812cb-4bae-4379-89af-005c5629b8f2-kube-api-access-7m48t\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.229930 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"2b2812cb-4bae-4379-89af-005c5629b8f2\") " pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.239661 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.279439 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.361591 4676 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.833523 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.896096 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b2812cb-4bae-4379-89af-005c5629b8f2","Type":"ContainerStarted","Data":"f45505577efe4eb812714233f11819264aacdf20f479ce796cc02559725f3b7d"} Dec 04 15:43:52 crc kubenswrapper[4676]: I1204 15:43:52.897884 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.034086 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.045800 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.066515 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.068815 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.071599 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.071725 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.071972 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.072177 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hf49c" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.072212 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.072991 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.074190 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.077173 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.197659 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.197729 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 
15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.197756 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjslc\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-kube-api-access-qjslc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198024 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b5e80e-65ee-42be-bf95-72e121d8e888-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198111 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198560 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198684 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b5e80e-65ee-42be-bf95-72e121d8e888-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198715 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198763 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198928 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.198983 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301515 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301596 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b5e80e-65ee-42be-bf95-72e121d8e888-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301621 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301663 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301724 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301747 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301824 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301867 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301898 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjslc\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-kube-api-access-qjslc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.301976 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b5e80e-65ee-42be-bf95-72e121d8e888-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.302016 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.302275 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.302722 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.303549 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.306627 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.306659 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.306734 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.306958 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.306963 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.307493 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90b5e80e-65ee-42be-bf95-72e121d8e888-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.314033 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.314307 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.314578 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90b5e80e-65ee-42be-bf95-72e121d8e888-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.315761 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.319197 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.319569 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90b5e80e-65ee-42be-bf95-72e121d8e888-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.322151 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjslc\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-kube-api-access-qjslc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.328792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90b5e80e-65ee-42be-bf95-72e121d8e888-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.336422 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90b5e80e-65ee-42be-bf95-72e121d8e888\") " pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.475024 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-hf49c" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.480199 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.490504 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfec4df-7119-489c-a2e8-17dddd0e5c1d" path="/var/lib/kubelet/pods/6bfec4df-7119-489c-a2e8-17dddd0e5c1d/volumes" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.491364 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743292d4-f5a5-48cd-bcb0-63fb95ac6910" path="/var/lib/kubelet/pods/743292d4-f5a5-48cd-bcb0-63fb95ac6910/volumes" Dec 04 15:43:53 crc kubenswrapper[4676]: I1204 15:43:53.961564 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 04 15:43:53 crc kubenswrapper[4676]: W1204 15:43:53.967657 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90b5e80e_65ee_42be_bf95_72e121d8e888.slice/crio-65c0144986cb4cd7b449a636ffdd98447b2af5bb848726f9a391c9fc3dad76e2 WatchSource:0}: Error finding container 65c0144986cb4cd7b449a636ffdd98447b2af5bb848726f9a391c9fc3dad76e2: Status 404 returned error can't find the container with id 65c0144986cb4cd7b449a636ffdd98447b2af5bb848726f9a391c9fc3dad76e2 Dec 04 15:43:54 crc kubenswrapper[4676]: I1204 15:43:54.920966 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90b5e80e-65ee-42be-bf95-72e121d8e888","Type":"ContainerStarted","Data":"65c0144986cb4cd7b449a636ffdd98447b2af5bb848726f9a391c9fc3dad76e2"} Dec 04 15:43:54 crc kubenswrapper[4676]: I1204 15:43:54.923497 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b2812cb-4bae-4379-89af-005c5629b8f2","Type":"ContainerStarted","Data":"fd106b045923bb21e036e9bbc4295fefcb77d4f78b5a62f0183e223ef748caed"} Dec 04 15:43:56 crc kubenswrapper[4676]: I1204 15:43:56.948741 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90b5e80e-65ee-42be-bf95-72e121d8e888","Type":"ContainerStarted","Data":"4764d6e50d0278f16f324d1cb835af9a55c2c3d6d2d03bae708f443c675c9553"} Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.412750 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.416146 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.426459 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.586856 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.673294 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.675985 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.676044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.676140 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.676273 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.676390 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.676539 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blrm6\" (UniqueName: \"kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.777604 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: 
\"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.777958 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.778006 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.778064 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.778112 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.778177 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blrm6\" (UniqueName: \"kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.778216 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.779030 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.779104 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.779143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " 
pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.779242 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.779615 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.780166 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:04 crc kubenswrapper[4676]: I1204 15:44:04.798919 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blrm6\" (UniqueName: \"kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6\") pod \"dnsmasq-dns-774f646dbc-wbzb2\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:05 crc kubenswrapper[4676]: I1204 15:44:05.047160 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:05 crc kubenswrapper[4676]: I1204 15:44:05.647221 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:06 crc kubenswrapper[4676]: I1204 15:44:06.066178 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" event={"ID":"60cd4419-0728-4945-879c-4964498ae376","Type":"ContainerDied","Data":"772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42"} Dec 04 15:44:06 crc kubenswrapper[4676]: I1204 15:44:06.067070 4676 generic.go:334] "Generic (PLEG): container finished" podID="60cd4419-0728-4945-879c-4964498ae376" containerID="772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42" exitCode=0 Dec 04 15:44:06 crc kubenswrapper[4676]: I1204 15:44:06.067186 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" event={"ID":"60cd4419-0728-4945-879c-4964498ae376","Type":"ContainerStarted","Data":"07145908351aa4b6affe08620c662bbf590353ef1355145f9235feca3912995a"} Dec 04 15:44:07 crc kubenswrapper[4676]: I1204 15:44:07.101785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" event={"ID":"60cd4419-0728-4945-879c-4964498ae376","Type":"ContainerStarted","Data":"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c"} Dec 04 15:44:07 crc kubenswrapper[4676]: I1204 15:44:07.102313 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:07 crc kubenswrapper[4676]: I1204 15:44:07.130189 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" podStartSLOduration=3.1301585530000002 podStartE2EDuration="3.130158553s" 
podCreationTimestamp="2025-12-04 15:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:44:07.121640674 +0000 UTC m=+1454.556310541" watchObservedRunningTime="2025-12-04 15:44:07.130158553 +0000 UTC m=+1454.564828410" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.048838 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.125840 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.126689 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="dnsmasq-dns" containerID="cri-o://648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695" gracePeriod=10 Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.318346 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b864cb897-lcnmv"] Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.320555 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.330500 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b864cb897-lcnmv"] Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385286 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzsms\" (UniqueName: \"kubernetes.io/projected/7d01e1f6-a481-4501-879f-e099a53f3070-kube-api-access-fzsms\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385344 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-swift-storage-0\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385488 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-svc\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385564 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-sb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385591 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-config\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " 
pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385656 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.385806 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-nb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.488706 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-svc\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489065 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-sb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489091 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-config\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489139 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489223 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-nb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489279 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzsms\" (UniqueName: \"kubernetes.io/projected/7d01e1f6-a481-4501-879f-e099a53f3070-kube-api-access-fzsms\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.489301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-swift-storage-0\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " 
pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.490079 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-svc\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.490243 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-nb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.490283 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-dns-swift-storage-0\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.490494 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-ovsdbserver-sb\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.490984 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-openstack-edpm-ipam\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.491067 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d01e1f6-a481-4501-879f-e099a53f3070-config\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.522370 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzsms\" (UniqueName: \"kubernetes.io/projected/7d01e1f6-a481-4501-879f-e099a53f3070-kube-api-access-fzsms\") pod \"dnsmasq-dns-6b864cb897-lcnmv\" (UID: \"7d01e1f6-a481-4501-879f-e099a53f3070\") " pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.782779 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:15 crc kubenswrapper[4676]: I1204 15:44:15.990525 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.026387 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.026446 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.107852 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.108662 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.108709 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4ww\" (UniqueName: \"kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.108785 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.108986 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.109056 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb\") pod \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\" (UID: \"5e9e8792-ee83-463a-be59-f11e4eaa78e0\") " Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.117210 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww" (OuterVolumeSpecName: "kube-api-access-wz4ww") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "kube-api-access-wz4ww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.183134 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.187358 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.188346 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.188392 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config" (OuterVolumeSpecName: "config") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.190017 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e9e8792-ee83-463a-be59-f11e4eaa78e0" (UID: "5e9e8792-ee83-463a-be59-f11e4eaa78e0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211780 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211837 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211850 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211858 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211933 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e9e8792-ee83-463a-be59-f11e4eaa78e0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.211944 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4ww\" (UniqueName: \"kubernetes.io/projected/5e9e8792-ee83-463a-be59-f11e4eaa78e0-kube-api-access-wz4ww\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.228636 4676 generic.go:334] "Generic (PLEG): container finished" podID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerID="648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695" exitCode=0 Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.228689 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" event={"ID":"5e9e8792-ee83-463a-be59-f11e4eaa78e0","Type":"ContainerDied","Data":"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695"} Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.228726 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" event={"ID":"5e9e8792-ee83-463a-be59-f11e4eaa78e0","Type":"ContainerDied","Data":"ca924f3bc8887fad574489ee51d48de4f579b2a9835a2efb4719ccbf87ad193d"} Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.228773 4676 scope.go:117] "RemoveContainer" containerID="648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.228991 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d658544b9-r5sxw" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.259111 4676 scope.go:117] "RemoveContainer" containerID="8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.334671 4676 scope.go:117] "RemoveContainer" containerID="648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695" Dec 04 15:44:16 crc kubenswrapper[4676]: E1204 15:44:16.336504 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695\": container with ID starting with 648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695 not found: ID does not exist" containerID="648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.336556 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695"} err="failed to get container status \"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695\": rpc error: code = NotFound desc = could not find container \"648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695\": container with ID starting with 648955daa8599b2fe027edd498492715549e9144bfaeb0698f328c3e11118695 not found: ID does not exist" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.336584 4676 scope.go:117] "RemoveContainer" containerID="8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90" Dec 04 15:44:16 crc kubenswrapper[4676]: E1204 15:44:16.336932 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90\": container with ID starting with 8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90 not found: ID does not exist" containerID="8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.336962 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90"} err="failed to get container status \"8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90\": rpc error: code = NotFound desc = could not find container \"8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90\": container with ID starting with 8fadc8137af776f904e0d33cb7285983d7335aed0d39b2e0893bb111d9418b90 not found: ID does not exist" Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.346636 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.360825 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d658544b9-r5sxw"] Dec 04 15:44:16 crc kubenswrapper[4676]: W1204 15:44:16.361453 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d01e1f6_a481_4501_879f_e099a53f3070.slice/crio-e89ab4cfe6b406f7fce83a0ba4a570c790aef60018793cbbf4d09acf1b7af73d WatchSource:0}: Error finding container e89ab4cfe6b406f7fce83a0ba4a570c790aef60018793cbbf4d09acf1b7af73d: Status 404 returned error can't find the container with id 
e89ab4cfe6b406f7fce83a0ba4a570c790aef60018793cbbf4d09acf1b7af73d Dec 04 15:44:16 crc kubenswrapper[4676]: I1204 15:44:16.383238 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b864cb897-lcnmv"] Dec 04 15:44:17 crc kubenswrapper[4676]: I1204 15:44:17.323648 4676 generic.go:334] "Generic (PLEG): container finished" podID="7d01e1f6-a481-4501-879f-e099a53f3070" containerID="136db23ef6ae530e2fc9f41ba4d5b7003734b9f3ebce9fb236de9088b7a868ea" exitCode=0 Dec 04 15:44:17 crc kubenswrapper[4676]: I1204 15:44:17.323763 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" event={"ID":"7d01e1f6-a481-4501-879f-e099a53f3070","Type":"ContainerDied","Data":"136db23ef6ae530e2fc9f41ba4d5b7003734b9f3ebce9fb236de9088b7a868ea"} Dec 04 15:44:17 crc kubenswrapper[4676]: I1204 15:44:17.324249 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" event={"ID":"7d01e1f6-a481-4501-879f-e099a53f3070","Type":"ContainerStarted","Data":"e89ab4cfe6b406f7fce83a0ba4a570c790aef60018793cbbf4d09acf1b7af73d"} Dec 04 15:44:17 crc kubenswrapper[4676]: I1204 15:44:17.410519 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" path="/var/lib/kubelet/pods/5e9e8792-ee83-463a-be59-f11e4eaa78e0/volumes" Dec 04 15:44:18 crc kubenswrapper[4676]: I1204 15:44:18.341013 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" event={"ID":"7d01e1f6-a481-4501-879f-e099a53f3070","Type":"ContainerStarted","Data":"9019ba75ac29fe61e8c30292abb7deac25c65d3cd9abf62ee75661786cfc6d04"} Dec 04 15:44:18 crc kubenswrapper[4676]: I1204 15:44:18.341360 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:18 crc kubenswrapper[4676]: I1204 15:44:18.362520 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" podStartSLOduration=3.362499542 podStartE2EDuration="3.362499542s" podCreationTimestamp="2025-12-04 15:44:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:44:18.358920478 +0000 UTC m=+1465.793590355" watchObservedRunningTime="2025-12-04 15:44:18.362499542 +0000 UTC m=+1465.797169399" Dec 04 15:44:25 crc kubenswrapper[4676]: I1204 15:44:25.785113 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b864cb897-lcnmv" Dec 04 15:44:25 crc kubenswrapper[4676]: I1204 15:44:25.879147 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:25 crc kubenswrapper[4676]: I1204 15:44:25.879446 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="dnsmasq-dns" containerID="cri-o://d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c" gracePeriod=10 Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.369942 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.431850 4676 generic.go:334] "Generic (PLEG): container finished" podID="60cd4419-0728-4945-879c-4964498ae376" containerID="d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c" exitCode=0 Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.431973 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.431962 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" event={"ID":"60cd4419-0728-4945-879c-4964498ae376","Type":"ContainerDied","Data":"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c"} Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.432117 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-774f646dbc-wbzb2" event={"ID":"60cd4419-0728-4945-879c-4964498ae376","Type":"ContainerDied","Data":"07145908351aa4b6affe08620c662bbf590353ef1355145f9235feca3912995a"} Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.432136 4676 scope.go:117] "RemoveContainer" containerID="d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.453068 4676 scope.go:117] "RemoveContainer" containerID="772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.472067 4676 scope.go:117] "RemoveContainer" containerID="d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c" Dec 04 15:44:26 crc kubenswrapper[4676]: E1204 15:44:26.472512 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c\": container with ID starting with d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c not found: ID does not exist" containerID="d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.472575 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c"} err="failed to get container status \"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c\": rpc error: code = NotFound desc = could not find container \"d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c\": container with ID starting with d7f9e1be93e897530b51d23eddc0b338eaf1800380aefc3a8daf387c4586071c not found: ID does not exist" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.472604 4676 scope.go:117] "RemoveContainer" containerID="772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42" Dec 04 15:44:26 crc kubenswrapper[4676]: E1204 15:44:26.473047 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42\": container with ID starting with 772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42 not found: ID does not exist" containerID="772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.473107 4676 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42"} err="failed to get container status \"772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42\": rpc error: code = NotFound desc = could not find container \"772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42\": container with ID starting with 772c0523d77fde8d60dbdce06ca979b845dc18d022267421a1cd650f93452e42 not found: ID does not exist" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.507282 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.507331 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.507427 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.507541 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blrm6\" (UniqueName: \"kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.508183 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.508210 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.508230 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb\") pod \"60cd4419-0728-4945-879c-4964498ae376\" (UID: \"60cd4419-0728-4945-879c-4964498ae376\") " Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.516641 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6" (OuterVolumeSpecName: "kube-api-access-blrm6") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "kube-api-access-blrm6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.570036 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.571451 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.578798 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.579829 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config" (OuterVolumeSpecName: "config") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.585687 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.586766 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60cd4419-0728-4945-879c-4964498ae376" (UID: "60cd4419-0728-4945-879c-4964498ae376"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611045 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blrm6\" (UniqueName: \"kubernetes.io/projected/60cd4419-0728-4945-879c-4964498ae376-kube-api-access-blrm6\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611083 4676 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611096 4676 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611110 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611122 4676 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611132 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.611144 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60cd4419-0728-4945-879c-4964498ae376-config\") on node \"crc\" DevicePath \"\"" Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.785338 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:26 crc kubenswrapper[4676]: I1204 15:44:26.799731 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774f646dbc-wbzb2"] Dec 04 15:44:27 crc kubenswrapper[4676]: I1204 15:44:27.398050 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cd4419-0728-4945-879c-4964498ae376" path="/var/lib/kubelet/pods/60cd4419-0728-4945-879c-4964498ae376/volumes" Dec 04 15:44:27 crc kubenswrapper[4676]: I1204 15:44:27.448721 4676 generic.go:334] "Generic (PLEG): container finished" podID="2b2812cb-4bae-4379-89af-005c5629b8f2" containerID="fd106b045923bb21e036e9bbc4295fefcb77d4f78b5a62f0183e223ef748caed" exitCode=0 Dec 04 15:44:27 crc kubenswrapper[4676]: I1204 15:44:27.448795 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b2812cb-4bae-4379-89af-005c5629b8f2","Type":"ContainerDied","Data":"fd106b045923bb21e036e9bbc4295fefcb77d4f78b5a62f0183e223ef748caed"} Dec 04 15:44:28 crc kubenswrapper[4676]: I1204 15:44:28.460385 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2b2812cb-4bae-4379-89af-005c5629b8f2","Type":"ContainerStarted","Data":"f391b6c0429851d5ca9065473d260c9dd39b70ff728bb371a4c395d570ae3f8d"} Dec 04 15:44:28 crc kubenswrapper[4676]: I1204 15:44:28.460921 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 04 
15:44:28 crc kubenswrapper[4676]: I1204 15:44:28.462356 4676 generic.go:334] "Generic (PLEG): container finished" podID="90b5e80e-65ee-42be-bf95-72e121d8e888" containerID="4764d6e50d0278f16f324d1cb835af9a55c2c3d6d2d03bae708f443c675c9553" exitCode=0 Dec 04 15:44:28 crc kubenswrapper[4676]: I1204 15:44:28.462428 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90b5e80e-65ee-42be-bf95-72e121d8e888","Type":"ContainerDied","Data":"4764d6e50d0278f16f324d1cb835af9a55c2c3d6d2d03bae708f443c675c9553"} Dec 04 15:44:28 crc kubenswrapper[4676]: I1204 15:44:28.504889 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.504873064 podStartE2EDuration="37.504873064s" podCreationTimestamp="2025-12-04 15:43:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:44:28.485240541 +0000 UTC m=+1475.919910398" watchObservedRunningTime="2025-12-04 15:44:28.504873064 +0000 UTC m=+1475.939542921" Dec 04 15:44:29 crc kubenswrapper[4676]: I1204 15:44:29.475721 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90b5e80e-65ee-42be-bf95-72e121d8e888","Type":"ContainerStarted","Data":"458995190be8151fc3d84dc6f5292adf7c20b4acdbb5f49a05bcc123287c73b3"} Dec 04 15:44:29 crc kubenswrapper[4676]: I1204 15:44:29.477066 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:44:42 crc kubenswrapper[4676]: I1204 15:44:42.364122 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 04 15:44:42 crc kubenswrapper[4676]: I1204 15:44:42.392406 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=49.392365033 podStartE2EDuration="49.392365033s" podCreationTimestamp="2025-12-04 15:43:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 15:44:29.506968238 +0000 UTC m=+1476.941638095" watchObservedRunningTime="2025-12-04 15:44:42.392365033 +0000 UTC m=+1489.827034880" Dec 04 15:44:43 crc kubenswrapper[4676]: I1204 15:44:43.483101 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.052621 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp"] Dec 04 15:44:44 crc kubenswrapper[4676]: E1204 15:44:44.053538 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="init" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.053580 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="init" Dec 04 15:44:44 crc kubenswrapper[4676]: E1204 15:44:44.053599 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.053696 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: E1204 15:44:44.053712 4676 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.053724 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: E1204 15:44:44.053738 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="init" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.053744 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="init" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.054002 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9e8792-ee83-463a-be59-f11e4eaa78e0" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.054027 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cd4419-0728-4945-879c-4964498ae376" containerName="dnsmasq-dns" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.054763 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.056680 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.057068 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.057091 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.057139 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.064987 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp"] Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.231998 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.232048 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.232075 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmj9h\" (UniqueName: \"kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 
15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.232106 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.334430 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.334479 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.334519 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmj9h\" (UniqueName: \"kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.334555 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.340761 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.341265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.344836 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc 
kubenswrapper[4676]: I1204 15:44:44.356417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmj9h\" (UniqueName: \"kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.376626 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:44:44 crc kubenswrapper[4676]: I1204 15:44:44.958147 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp"] Dec 04 15:44:45 crc kubenswrapper[4676]: I1204 15:44:45.655664 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" event={"ID":"43fc84a7-d9a0-4eba-93e6-c72e566a2b99","Type":"ContainerStarted","Data":"f7f1f9501fb2ad1f60418db655924de6c9771275f3d3f88c72ca6adfc1adb52f"} Dec 04 15:44:46 crc kubenswrapper[4676]: I1204 15:44:46.027551 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:44:46 crc kubenswrapper[4676]: I1204 15:44:46.027621 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:44:53 crc kubenswrapper[4676]: I1204 15:44:53.660479 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:44:54 crc kubenswrapper[4676]: I1204 15:44:54.774192 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" event={"ID":"43fc84a7-d9a0-4eba-93e6-c72e566a2b99","Type":"ContainerStarted","Data":"6c51b4782b66b7128dd90cad7b1caccbb3440493b67c081c2d9abe7d71a32db8"} Dec 04 15:44:54 crc kubenswrapper[4676]: I1204 15:44:54.793010 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" podStartSLOduration=2.099613364 podStartE2EDuration="10.792994579s" podCreationTimestamp="2025-12-04 15:44:44 +0000 UTC" firstStartedPulling="2025-12-04 15:44:44.962393266 +0000 UTC m=+1492.397063123" lastFinishedPulling="2025-12-04 15:44:53.655774491 +0000 UTC m=+1501.090444338" observedRunningTime="2025-12-04 15:44:54.791754303 +0000 UTC m=+1502.226424160" watchObservedRunningTime="2025-12-04 15:44:54.792994579 +0000 UTC m=+1502.227664436" Dec 04 15:44:58 crc kubenswrapper[4676]: I1204 15:44:58.914525 4676 scope.go:117] "RemoveContainer" containerID="2e36588d3aa3e3b96d231812753dcec1011d788732fec58db777c6c362982fec" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.150553 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g"] Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.153312 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.155954 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.159041 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.166874 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g"] Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.283923 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzpn\" (UniqueName: \"kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.284015 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.284041 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.385668 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtzpn\" (UniqueName: \"kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.385757 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.385788 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.386658 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume\") pod 
\"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.393022 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.403260 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtzpn\" (UniqueName: \"kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn\") pod \"collect-profiles-29414385-9656g\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.482546 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:00 crc kubenswrapper[4676]: I1204 15:45:00.963731 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g"] Dec 04 15:45:01 crc kubenswrapper[4676]: I1204 15:45:01.852735 4676 generic.go:334] "Generic (PLEG): container finished" podID="20692633-6767-45ee-8e4b-e89de3a134a5" containerID="9cec22e7763aa207a6df1fdd9de1966b4a24c8a61cdcfd873a14e02da0955f9e" exitCode=0 Dec 04 15:45:01 crc kubenswrapper[4676]: I1204 15:45:01.853129 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" event={"ID":"20692633-6767-45ee-8e4b-e89de3a134a5","Type":"ContainerDied","Data":"9cec22e7763aa207a6df1fdd9de1966b4a24c8a61cdcfd873a14e02da0955f9e"} Dec 04 15:45:01 crc kubenswrapper[4676]: I1204 15:45:01.853208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" event={"ID":"20692633-6767-45ee-8e4b-e89de3a134a5","Type":"ContainerStarted","Data":"7e19fcdc599877d01c45e6b22c6f637e6b689fc670da15951eceafea10624834"} Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.227819 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.346424 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume\") pod \"20692633-6767-45ee-8e4b-e89de3a134a5\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.346680 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume\") pod \"20692633-6767-45ee-8e4b-e89de3a134a5\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.347229 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtzpn\" (UniqueName: \"kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn\") pod \"20692633-6767-45ee-8e4b-e89de3a134a5\" (UID: \"20692633-6767-45ee-8e4b-e89de3a134a5\") " Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.347457 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "20692633-6767-45ee-8e4b-e89de3a134a5" (UID: "20692633-6767-45ee-8e4b-e89de3a134a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.348116 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/20692633-6767-45ee-8e4b-e89de3a134a5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.356023 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "20692633-6767-45ee-8e4b-e89de3a134a5" (UID: "20692633-6767-45ee-8e4b-e89de3a134a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.356253 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn" (OuterVolumeSpecName: "kube-api-access-gtzpn") pod "20692633-6767-45ee-8e4b-e89de3a134a5" (UID: "20692633-6767-45ee-8e4b-e89de3a134a5"). InnerVolumeSpecName "kube-api-access-gtzpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.449798 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/20692633-6767-45ee-8e4b-e89de3a134a5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.450177 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtzpn\" (UniqueName: \"kubernetes.io/projected/20692633-6767-45ee-8e4b-e89de3a134a5-kube-api-access-gtzpn\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.876020 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" event={"ID":"20692633-6767-45ee-8e4b-e89de3a134a5","Type":"ContainerDied","Data":"7e19fcdc599877d01c45e6b22c6f637e6b689fc670da15951eceafea10624834"} Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.876076 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e19fcdc599877d01c45e6b22c6f637e6b689fc670da15951eceafea10624834" Dec 04 15:45:03 crc kubenswrapper[4676]: I1204 15:45:03.876142 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g" Dec 04 15:45:05 crc kubenswrapper[4676]: I1204 15:45:05.897052 4676 generic.go:334] "Generic (PLEG): container finished" podID="43fc84a7-d9a0-4eba-93e6-c72e566a2b99" containerID="6c51b4782b66b7128dd90cad7b1caccbb3440493b67c081c2d9abe7d71a32db8" exitCode=0 Dec 04 15:45:05 crc kubenswrapper[4676]: I1204 15:45:05.897176 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" event={"ID":"43fc84a7-d9a0-4eba-93e6-c72e566a2b99","Type":"ContainerDied","Data":"6c51b4782b66b7128dd90cad7b1caccbb3440493b67c081c2d9abe7d71a32db8"} Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.347638 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.432778 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmj9h\" (UniqueName: \"kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h\") pod \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.432871 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory\") pod \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.432972 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle\") pod \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.433083 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key\") pod \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\" (UID: \"43fc84a7-d9a0-4eba-93e6-c72e566a2b99\") " Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.438540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "43fc84a7-d9a0-4eba-93e6-c72e566a2b99" (UID: "43fc84a7-d9a0-4eba-93e6-c72e566a2b99"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.440990 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h" (OuterVolumeSpecName: "kube-api-access-gmj9h") pod "43fc84a7-d9a0-4eba-93e6-c72e566a2b99" (UID: "43fc84a7-d9a0-4eba-93e6-c72e566a2b99"). InnerVolumeSpecName "kube-api-access-gmj9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.466519 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "43fc84a7-d9a0-4eba-93e6-c72e566a2b99" (UID: "43fc84a7-d9a0-4eba-93e6-c72e566a2b99"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.469637 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory" (OuterVolumeSpecName: "inventory") pod "43fc84a7-d9a0-4eba-93e6-c72e566a2b99" (UID: "43fc84a7-d9a0-4eba-93e6-c72e566a2b99"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.535878 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmj9h\" (UniqueName: \"kubernetes.io/projected/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-kube-api-access-gmj9h\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.537000 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.537048 4676 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.537063 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/43fc84a7-d9a0-4eba-93e6-c72e566a2b99-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.920299 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" event={"ID":"43fc84a7-d9a0-4eba-93e6-c72e566a2b99","Type":"ContainerDied","Data":"f7f1f9501fb2ad1f60418db655924de6c9771275f3d3f88c72ca6adfc1adb52f"} Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.920660 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f1f9501fb2ad1f60418db655924de6c9771275f3d3f88c72ca6adfc1adb52f" Dec 04 15:45:07 crc kubenswrapper[4676]: I1204 15:45:07.920353 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.002805 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp"] Dec 04 15:45:08 crc kubenswrapper[4676]: E1204 15:45:08.003644 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20692633-6767-45ee-8e4b-e89de3a134a5" containerName="collect-profiles" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.003673 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="20692633-6767-45ee-8e4b-e89de3a134a5" containerName="collect-profiles" Dec 04 15:45:08 crc kubenswrapper[4676]: E1204 15:45:08.003707 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fc84a7-d9a0-4eba-93e6-c72e566a2b99" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.003748 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc84a7-d9a0-4eba-93e6-c72e566a2b99" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.004073 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fc84a7-d9a0-4eba-93e6-c72e566a2b99" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.004117 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="20692633-6767-45ee-8e4b-e89de3a134a5" containerName="collect-profiles" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.005418 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.008352 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.008897 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.009174 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.009281 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.028646 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp"] Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.148984 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.149072 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6ms\" (UniqueName: \"kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.149515 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.250853 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.250970 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6ms\" (UniqueName: \"kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.251021 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.255829 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.256527 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.269577 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6ms\" (UniqueName: \"kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-w64mp\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.327494 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.909215 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp"] Dec 04 15:45:08 crc kubenswrapper[4676]: I1204 15:45:08.933203 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" event={"ID":"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd","Type":"ContainerStarted","Data":"e78691c59e8539c04684b343515475d89aa269569347503735771dc1061d6e14"} Dec 04 15:45:09 crc kubenswrapper[4676]: I1204 15:45:09.943862 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" event={"ID":"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd","Type":"ContainerStarted","Data":"f868eeeb059ee32ba636e615e91c95199ee24c38305755ba8468bcab16c2b5a5"} Dec 04 15:45:09 crc kubenswrapper[4676]: I1204 15:45:09.963625 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" podStartSLOduration=2.260764803 podStartE2EDuration="2.963600674s" podCreationTimestamp="2025-12-04 15:45:07 +0000 UTC" firstStartedPulling="2025-12-04 15:45:08.917738242 +0000 UTC m=+1516.352408099" lastFinishedPulling="2025-12-04 15:45:09.620574113 +0000 UTC m=+1517.055243970" observedRunningTime="2025-12-04 15:45:09.96347114 +0000 UTC m=+1517.398141007" watchObservedRunningTime="2025-12-04 15:45:09.963600674 +0000 UTC m=+1517.398270541" Dec 04 15:45:12 crc kubenswrapper[4676]: I1204 15:45:12.980402 4676 generic.go:334] "Generic (PLEG): container finished" podID="1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" containerID="f868eeeb059ee32ba636e615e91c95199ee24c38305755ba8468bcab16c2b5a5" exitCode=0 Dec 04 15:45:12 crc kubenswrapper[4676]: I1204 15:45:12.980488 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" 
event={"ID":"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd","Type":"ContainerDied","Data":"f868eeeb059ee32ba636e615e91c95199ee24c38305755ba8468bcab16c2b5a5"} Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.575961 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.618522 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory\") pod \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.618580 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v6ms\" (UniqueName: \"kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms\") pod \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.618677 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key\") pod \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\" (UID: \"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd\") " Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.635105 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms" (OuterVolumeSpecName: "kube-api-access-9v6ms") pod "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" (UID: "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd"). InnerVolumeSpecName "kube-api-access-9v6ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.649868 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" (UID: "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.652369 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory" (OuterVolumeSpecName: "inventory") pod "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" (UID: "1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.720787 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.720824 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:14 crc kubenswrapper[4676]: I1204 15:45:14.720838 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v6ms\" (UniqueName: \"kubernetes.io/projected/1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd-kube-api-access-9v6ms\") on node \"crc\" DevicePath \"\"" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.000740 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.000738 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-w64mp" event={"ID":"1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd","Type":"ContainerDied","Data":"e78691c59e8539c04684b343515475d89aa269569347503735771dc1061d6e14"} Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.000867 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e78691c59e8539c04684b343515475d89aa269569347503735771dc1061d6e14" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.072688 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw"] Dec 04 15:45:15 crc kubenswrapper[4676]: E1204 15:45:15.073513 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.073548 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.073873 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.074707 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.077242 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.077288 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.077480 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.077479 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.097541 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw"] Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.128252 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.128340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pg8d\" (UniqueName: \"kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.128493 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.128554 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.230372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pg8d\" (UniqueName: \"kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.230491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.230551 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.230689 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.234415 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.234432 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.241964 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.256749 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pg8d\" (UniqueName: \"kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:15 crc kubenswrapper[4676]: I1204 15:45:15.396780 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.008372 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw"] Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.027318 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.027380 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.027428 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.028312 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 15:45:16 crc kubenswrapper[4676]: I1204 15:45:16.028388 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" gracePeriod=600 Dec 04 15:45:16 crc kubenswrapper[4676]: E1204 15:45:16.158755 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.022630 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" event={"ID":"7778f969-2f94-4830-8685-bb42b6a9fd23","Type":"ContainerStarted","Data":"7dc943e58ed2151f6e7a1e47f76961848d719c80e688ae84617b47498f884f49"} Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.023157 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" event={"ID":"7778f969-2f94-4830-8685-bb42b6a9fd23","Type":"ContainerStarted","Data":"77a1be6faecaea0eeaa3e11a4cd93b55f26dccc84aa134b8828768694b90050a"} Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.025427 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" exitCode=0 Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.025456 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994"} Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.025533 4676 scope.go:117] "RemoveContainer" containerID="4ed31aaa37dc8e9548191807986356b721b0f7ff822299d24779fcd58f9d4ea2" Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.128691 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:45:17 crc kubenswrapper[4676]: E1204 15:45:17.129243 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:45:17 crc kubenswrapper[4676]: I1204 15:45:17.144501 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" podStartSLOduration=1.7391536890000001 podStartE2EDuration="2.144466508s" podCreationTimestamp="2025-12-04 15:45:15 +0000 UTC" firstStartedPulling="2025-12-04 15:45:16.01547141 +0000 UTC m=+1523.450141267" lastFinishedPulling="2025-12-04 15:45:16.420784229 +0000 UTC m=+1523.855454086" observedRunningTime="2025-12-04 15:45:17.144099438 +0000 UTC m=+1524.578769295" watchObservedRunningTime="2025-12-04 15:45:17.144466508 +0000 UTC m=+1524.579136365" Dec 04 15:45:31 crc kubenswrapper[4676]: I1204 15:45:31.383979 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:45:31 crc kubenswrapper[4676]: E1204 15:45:31.384606 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:45:44 crc kubenswrapper[4676]: I1204 15:45:44.385537 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:45:44 crc kubenswrapper[4676]: E1204 15:45:44.387693 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.094512 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.109059 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.112565 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.207559 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.207850 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv64t\" (UniqueName: \"kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.207964 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.309744 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.309828 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv64t\" (UniqueName: \"kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.309860 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.310328 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.310430 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.330509 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wv64t\" (UniqueName: \"kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t\") pod \"certified-operators-gw7z8\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:47 crc kubenswrapper[4676]: I1204 15:45:47.511152 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:48 crc kubenswrapper[4676]: I1204 15:45:48.086886 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:45:48 crc kubenswrapper[4676]: I1204 15:45:48.440735 4676 generic.go:334] "Generic (PLEG): container finished" podID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerID="05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569" exitCode=0 Dec 04 15:45:48 crc kubenswrapper[4676]: I1204 15:45:48.440785 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerDied","Data":"05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569"} Dec 04 15:45:48 crc kubenswrapper[4676]: I1204 15:45:48.441113 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerStarted","Data":"fc262e3299de90b8805b4866048bf9a3ac8ff9e7562a322c8ce2e512d61c13ad"} Dec 04 15:45:48 crc kubenswrapper[4676]: I1204 15:45:48.442890 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 15:45:50 crc kubenswrapper[4676]: I1204 15:45:50.463556 4676 generic.go:334] "Generic (PLEG): container finished" podID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerID="59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb" exitCode=0 Dec 04 15:45:50 crc kubenswrapper[4676]: I1204 15:45:50.464280 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerDied","Data":"59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb"} Dec 04 15:45:51 crc kubenswrapper[4676]: I1204 15:45:51.478447 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerStarted","Data":"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f"} Dec 04 15:45:51 crc kubenswrapper[4676]: I1204 15:45:51.505209 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gw7z8" podStartSLOduration=2.025564026 podStartE2EDuration="4.505189051s" podCreationTimestamp="2025-12-04 15:45:47 +0000 UTC" firstStartedPulling="2025-12-04 15:45:48.442481219 +0000 UTC m=+1555.877151076" lastFinishedPulling="2025-12-04 15:45:50.922106244 +0000 UTC m=+1558.356776101" observedRunningTime="2025-12-04 15:45:51.50207963 +0000 UTC m=+1558.936749487" watchObservedRunningTime="2025-12-04 15:45:51.505189051 +0000 UTC m=+1558.939858908" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.384729 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:45:57 crc kubenswrapper[4676]: E1204 15:45:57.385527 4676 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.512255 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.512312 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.564992 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.624718 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:45:57 crc kubenswrapper[4676]: I1204 15:45:57.810111 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:45:59 crc kubenswrapper[4676]: I1204 15:45:59.019897 4676 scope.go:117] "RemoveContainer" containerID="11e68f22087cbef6fd18fafc8e8fc08b35bde36e595de05e4c3639967d15e93d" Dec 04 15:45:59 crc kubenswrapper[4676]: I1204 15:45:59.065188 4676 scope.go:117] "RemoveContainer" containerID="5ff0e123d2871311010f4c700658038c77d25579500ac55a0cf708fc3b6ba537" Dec 04 15:45:59 crc kubenswrapper[4676]: I1204 15:45:59.580577 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gw7z8" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="registry-server" containerID="cri-o://29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f" gracePeriod=2 Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.065998 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.201720 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv64t\" (UniqueName: \"kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t\") pod \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.201966 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities\") pod \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.202055 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content\") pod \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\" (UID: \"2762bc5b-7a16-4adc-8835-76bc0e5bde99\") " Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.202895 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities" (OuterVolumeSpecName: "utilities") pod "2762bc5b-7a16-4adc-8835-76bc0e5bde99" (UID: "2762bc5b-7a16-4adc-8835-76bc0e5bde99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.207557 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t" (OuterVolumeSpecName: "kube-api-access-wv64t") pod "2762bc5b-7a16-4adc-8835-76bc0e5bde99" (UID: "2762bc5b-7a16-4adc-8835-76bc0e5bde99"). InnerVolumeSpecName "kube-api-access-wv64t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.304808 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv64t\" (UniqueName: \"kubernetes.io/projected/2762bc5b-7a16-4adc-8835-76bc0e5bde99-kube-api-access-wv64t\") on node \"crc\" DevicePath \"\"" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.304854 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.323418 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2762bc5b-7a16-4adc-8835-76bc0e5bde99" (UID: "2762bc5b-7a16-4adc-8835-76bc0e5bde99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.406614 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2762bc5b-7a16-4adc-8835-76bc0e5bde99-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.592922 4676 generic.go:334] "Generic (PLEG): container finished" podID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerID="29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f" exitCode=0 Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.593012 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gw7z8" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.593009 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerDied","Data":"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f"} Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.593160 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gw7z8" event={"ID":"2762bc5b-7a16-4adc-8835-76bc0e5bde99","Type":"ContainerDied","Data":"fc262e3299de90b8805b4866048bf9a3ac8ff9e7562a322c8ce2e512d61c13ad"} Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.593186 4676 scope.go:117] "RemoveContainer" containerID="29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.631256 4676 scope.go:117] "RemoveContainer" containerID="59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.638250 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.648601 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gw7z8"] Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.653609 4676 scope.go:117] "RemoveContainer" containerID="05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.717692 4676 scope.go:117] "RemoveContainer" containerID="29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f" Dec 04 15:46:00 crc kubenswrapper[4676]: E1204 15:46:00.718456 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f\": container with ID starting with 29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f not found: ID does not exist" containerID="29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.718503 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f"} err="failed to get container status \"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f\": rpc error: code = NotFound desc = could not find container \"29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f\": container with ID starting with 29d3c460173c6cdcb0624b5d66d86116e0837bc262fb814d96584f4ec8400b6f not found: ID does not exist" Dec 04 
15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.718535 4676 scope.go:117] "RemoveContainer" containerID="59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb" Dec 04 15:46:00 crc kubenswrapper[4676]: E1204 15:46:00.718998 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb\": container with ID starting with 59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb not found: ID does not exist" containerID="59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.719022 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb"} err="failed to get container status \"59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb\": rpc error: code = NotFound desc = could not find container \"59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb\": container with ID starting with 59d8d8b4bde3be5447f6bd64b82775d165baf65674fbf4215c4d7a10ba5bc5cb not found: ID does not exist" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.719036 4676 scope.go:117] "RemoveContainer" containerID="05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569" Dec 04 15:46:00 crc kubenswrapper[4676]: E1204 15:46:00.719561 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569\": container with ID starting with 05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569 not found: ID does not exist" containerID="05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569" Dec 04 15:46:00 crc kubenswrapper[4676]: I1204 15:46:00.719668 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569"} err="failed to get container status \"05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569\": rpc error: code = NotFound desc = could not find container \"05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569\": container with ID starting with 05b94e32f483a536cd4674fc024c6ae3da7e697f3bb15d14909fcfa85ba76569 not found: ID does not exist" Dec 04 15:46:01 crc kubenswrapper[4676]: I1204 15:46:01.396419 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" path="/var/lib/kubelet/pods/2762bc5b-7a16-4adc-8835-76bc0e5bde99/volumes" Dec 04 15:46:08 crc kubenswrapper[4676]: I1204 15:46:08.384276 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:46:08 crc kubenswrapper[4676]: E1204 15:46:08.385030 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:46:21 crc kubenswrapper[4676]: I1204 15:46:21.384323 4676 scope.go:117] "RemoveContainer" 
containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:46:21 crc kubenswrapper[4676]: E1204 15:46:21.385220 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:46:34 crc kubenswrapper[4676]: I1204 15:46:34.384291 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:46:34 crc kubenswrapper[4676]: E1204 15:46:34.385066 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:46:49 crc kubenswrapper[4676]: I1204 15:46:49.386367 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:46:49 crc kubenswrapper[4676]: E1204 15:46:49.387327 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.468951 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:46:53 crc kubenswrapper[4676]: E1204 15:46:53.471229 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="extract-utilities" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.471357 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="extract-utilities" Dec 04 15:46:53 crc kubenswrapper[4676]: E1204 15:46:53.471454 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="extract-content" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.471559 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="extract-content" Dec 04 15:46:53 crc kubenswrapper[4676]: E1204 15:46:53.471666 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="registry-server" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.471741 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="registry-server" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.472131 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2762bc5b-7a16-4adc-8835-76bc0e5bde99" containerName="registry-server" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.474166 4676 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.479533 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.526289 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.526459 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cthsf\" (UniqueName: \"kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.526614 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.629868 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.630009 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cthsf\" (UniqueName: \"kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.630086 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.630641 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.630664 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 
15:46:53.650692 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cthsf\" (UniqueName: \"kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf\") pod \"community-operators-v92hf\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:53 crc kubenswrapper[4676]: I1204 15:46:53.814752 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:46:54 crc kubenswrapper[4676]: I1204 15:46:54.375067 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:46:55 crc kubenswrapper[4676]: I1204 15:46:55.204065 4676 generic.go:334] "Generic (PLEG): container finished" podID="573a3f90-9310-4348-999a-2d0d705f86d7" containerID="8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4" exitCode=0 Dec 04 15:46:55 crc kubenswrapper[4676]: I1204 15:46:55.204427 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerDied","Data":"8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4"} Dec 04 15:46:55 crc kubenswrapper[4676]: I1204 15:46:55.204470 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerStarted","Data":"9df358bb21d3452bfc4d7aecdf97d84f62b66e837af1ad52fca0ab5f10964d5b"} Dec 04 15:46:57 crc kubenswrapper[4676]: I1204 15:46:57.262245 4676 generic.go:334] "Generic (PLEG): container finished" podID="573a3f90-9310-4348-999a-2d0d705f86d7" containerID="d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775" exitCode=0 Dec 04 15:46:57 crc kubenswrapper[4676]: I1204 15:46:57.262807 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerDied","Data":"d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775"} Dec 04 15:46:58 crc kubenswrapper[4676]: I1204 15:46:58.275889 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerStarted","Data":"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700"} Dec 04 15:46:58 crc kubenswrapper[4676]: I1204 15:46:58.305483 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v92hf" podStartSLOduration=2.49503865 podStartE2EDuration="5.305459569s" podCreationTimestamp="2025-12-04 15:46:53 +0000 UTC" firstStartedPulling="2025-12-04 15:46:55.208192449 +0000 UTC m=+1622.642862306" lastFinishedPulling="2025-12-04 15:46:58.018613368 +0000 UTC m=+1625.453283225" observedRunningTime="2025-12-04 15:46:58.294701105 +0000 UTC m=+1625.729370982" watchObservedRunningTime="2025-12-04 15:46:58.305459569 +0000 UTC m=+1625.740129426" Dec 04 15:46:59 crc kubenswrapper[4676]: I1204 15:46:59.154756 4676 scope.go:117] "RemoveContainer" containerID="767196b55b820c811f159ad655fdad46b26d039f0b4a40b416c3f227556037b7" Dec 04 15:46:59 crc kubenswrapper[4676]: I1204 15:46:59.202198 4676 scope.go:117] "RemoveContainer" containerID="76e8ec3687c595b74a30ee8b2620faaaa2a2ddacd7461b0200f45a4341ebb4de" Dec 04 15:46:59 crc kubenswrapper[4676]: 
I1204 15:46:59.229611 4676 scope.go:117] "RemoveContainer" containerID="c82492a192734375701e59a66be12946fadc6db4a6f6b952e3ed209ee42a79d2" Dec 04 15:46:59 crc kubenswrapper[4676]: I1204 15:46:59.274575 4676 scope.go:117] "RemoveContainer" containerID="d283cd29bebe9125919aa14c5070366b65637553c2c29c614f873042dfd3c923" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.532424 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.534992 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.567213 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.577102 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht2jj\" (UniqueName: \"kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.577293 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.577335 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.680507 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.680614 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.680771 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht2jj\" (UniqueName: \"kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.681078 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content\") pod 
\"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.681199 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.701604 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht2jj\" (UniqueName: \"kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj\") pod \"redhat-marketplace-qkbfk\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:00 crc kubenswrapper[4676]: I1204 15:47:00.862267 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:01 crc kubenswrapper[4676]: I1204 15:47:01.410925 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:02 crc kubenswrapper[4676]: I1204 15:47:02.320059 4676 generic.go:334] "Generic (PLEG): container finished" podID="96abd097-100f-4694-962d-85d3cbdb86b3" containerID="28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44" exitCode=0 Dec 04 15:47:02 crc kubenswrapper[4676]: I1204 15:47:02.320119 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerDied","Data":"28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44"} Dec 04 15:47:02 crc kubenswrapper[4676]: I1204 15:47:02.320319 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerStarted","Data":"7d2bf8b45f8a97c2be535d54b14e44d6349a4f3d1cf60adcb782e513abfb14e2"} Dec 04 15:47:03 crc kubenswrapper[4676]: I1204 15:47:03.332980 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerStarted","Data":"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702"} Dec 04 15:47:03 crc kubenswrapper[4676]: I1204 15:47:03.394369 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:47:03 crc kubenswrapper[4676]: E1204 15:47:03.394861 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:47:03 crc kubenswrapper[4676]: I1204 15:47:03.815049 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:47:03 crc kubenswrapper[4676]: I1204 15:47:03.816196 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v92hf" Dec 04 
15:47:03 crc kubenswrapper[4676]: I1204 15:47:03.865146 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:47:04 crc kubenswrapper[4676]: I1204 15:47:04.345987 4676 generic.go:334] "Generic (PLEG): container finished" podID="96abd097-100f-4694-962d-85d3cbdb86b3" containerID="c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702" exitCode=0 Dec 04 15:47:04 crc kubenswrapper[4676]: I1204 15:47:04.346086 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerDied","Data":"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702"} Dec 04 15:47:04 crc kubenswrapper[4676]: I1204 15:47:04.407100 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:47:06 crc kubenswrapper[4676]: I1204 15:47:06.249645 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:47:06 crc kubenswrapper[4676]: I1204 15:47:06.375438 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerStarted","Data":"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049"} Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.384761 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v92hf" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="registry-server" containerID="cri-o://221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700" gracePeriod=2 Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.908310 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.932438 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qkbfk" podStartSLOduration=4.518077895 podStartE2EDuration="7.932414628s" podCreationTimestamp="2025-12-04 15:47:00 +0000 UTC" firstStartedPulling="2025-12-04 15:47:02.322616664 +0000 UTC m=+1629.757286521" lastFinishedPulling="2025-12-04 15:47:05.736953387 +0000 UTC m=+1633.171623254" observedRunningTime="2025-12-04 15:47:06.408857095 +0000 UTC m=+1633.843526952" watchObservedRunningTime="2025-12-04 15:47:07.932414628 +0000 UTC m=+1635.367084475" Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.956250 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content\") pod \"573a3f90-9310-4348-999a-2d0d705f86d7\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.956429 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cthsf\" (UniqueName: \"kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf\") pod \"573a3f90-9310-4348-999a-2d0d705f86d7\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.956530 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities\") pod \"573a3f90-9310-4348-999a-2d0d705f86d7\" (UID: \"573a3f90-9310-4348-999a-2d0d705f86d7\") " Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.957529 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities" (OuterVolumeSpecName: "utilities") pod "573a3f90-9310-4348-999a-2d0d705f86d7" (UID: "573a3f90-9310-4348-999a-2d0d705f86d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:47:07 crc kubenswrapper[4676]: I1204 15:47:07.964508 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf" (OuterVolumeSpecName: "kube-api-access-cthsf") pod "573a3f90-9310-4348-999a-2d0d705f86d7" (UID: "573a3f90-9310-4348-999a-2d0d705f86d7"). InnerVolumeSpecName "kube-api-access-cthsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.021846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "573a3f90-9310-4348-999a-2d0d705f86d7" (UID: "573a3f90-9310-4348-999a-2d0d705f86d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.058477 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cthsf\" (UniqueName: \"kubernetes.io/projected/573a3f90-9310-4348-999a-2d0d705f86d7-kube-api-access-cthsf\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.058722 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.058781 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/573a3f90-9310-4348-999a-2d0d705f86d7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.399782 4676 generic.go:334] "Generic (PLEG): container finished" podID="573a3f90-9310-4348-999a-2d0d705f86d7" containerID="221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700" exitCode=0 Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.399841 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v92hf" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.399845 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerDied","Data":"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700"} Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.400616 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v92hf" event={"ID":"573a3f90-9310-4348-999a-2d0d705f86d7","Type":"ContainerDied","Data":"9df358bb21d3452bfc4d7aecdf97d84f62b66e837af1ad52fca0ab5f10964d5b"} Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.400647 4676 scope.go:117] "RemoveContainer" containerID="221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.427239 4676 scope.go:117] "RemoveContainer" containerID="d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.435192 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.444741 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v92hf"] Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.464432 4676 scope.go:117] "RemoveContainer" containerID="8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.509065 4676 scope.go:117] "RemoveContainer" containerID="221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700" Dec 04 15:47:08 crc kubenswrapper[4676]: E1204 15:47:08.509592 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700\": container with ID starting with 221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700 not found: ID does not exist" containerID="221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.509630 
4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700"} err="failed to get container status \"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700\": rpc error: code = NotFound desc = could not find container \"221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700\": container with ID starting with 221689127d8dd8f8630dd6dfd8228d7430c9a0769d5c5c728d2f6e4f2dbec700 not found: ID does not exist" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.509652 4676 scope.go:117] "RemoveContainer" containerID="d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775" Dec 04 15:47:08 crc kubenswrapper[4676]: E1204 15:47:08.510053 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775\": container with ID starting with d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775 not found: ID does not exist" containerID="d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.510077 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775"} err="failed to get container status \"d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775\": rpc error: code = NotFound desc = could not find container \"d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775\": container with ID starting with d57fc1466ed8fae75ead2cab8ada083b345bb2ffe6d1043f7479aa70d29b2775 not found: ID does not exist" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.510093 4676 scope.go:117] "RemoveContainer" containerID="8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4" Dec 04 15:47:08 crc kubenswrapper[4676]: E1204 15:47:08.510328 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4\": container with ID starting with 8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4 not found: ID does not exist" containerID="8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4" Dec 04 15:47:08 crc kubenswrapper[4676]: I1204 15:47:08.510372 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4"} err="failed to get container status \"8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4\": rpc error: code = NotFound desc = could not find container \"8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4\": container with ID starting with 8bdd4e001139839991edd909eead8db9530f0eb73d4fa072b2255d3f74d1a1b4 not found: ID does not exist" Dec 04 15:47:09 crc kubenswrapper[4676]: I1204 15:47:09.399454 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" path="/var/lib/kubelet/pods/573a3f90-9310-4348-999a-2d0d705f86d7/volumes" Dec 04 15:47:10 crc kubenswrapper[4676]: I1204 15:47:10.862886 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:10 crc kubenswrapper[4676]: I1204 15:47:10.863852 4676 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:10 crc kubenswrapper[4676]: I1204 15:47:10.910383 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:11 crc kubenswrapper[4676]: I1204 15:47:11.472732 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:12 crc kubenswrapper[4676]: I1204 15:47:12.246783 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:13 crc kubenswrapper[4676]: I1204 15:47:13.472895 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qkbfk" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="registry-server" containerID="cri-o://0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049" gracePeriod=2 Dec 04 15:47:13 crc kubenswrapper[4676]: I1204 15:47:13.961291 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.083203 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht2jj\" (UniqueName: \"kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj\") pod \"96abd097-100f-4694-962d-85d3cbdb86b3\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.083307 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities\") pod \"96abd097-100f-4694-962d-85d3cbdb86b3\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.083398 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content\") pod \"96abd097-100f-4694-962d-85d3cbdb86b3\" (UID: \"96abd097-100f-4694-962d-85d3cbdb86b3\") " Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.084209 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities" (OuterVolumeSpecName: "utilities") pod "96abd097-100f-4694-962d-85d3cbdb86b3" (UID: "96abd097-100f-4694-962d-85d3cbdb86b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.088284 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj" (OuterVolumeSpecName: "kube-api-access-ht2jj") pod "96abd097-100f-4694-962d-85d3cbdb86b3" (UID: "96abd097-100f-4694-962d-85d3cbdb86b3"). InnerVolumeSpecName "kube-api-access-ht2jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.105285 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96abd097-100f-4694-962d-85d3cbdb86b3" (UID: "96abd097-100f-4694-962d-85d3cbdb86b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.186423 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.186468 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96abd097-100f-4694-962d-85d3cbdb86b3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.186485 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht2jj\" (UniqueName: \"kubernetes.io/projected/96abd097-100f-4694-962d-85d3cbdb86b3-kube-api-access-ht2jj\") on node \"crc\" DevicePath \"\"" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.483892 4676 generic.go:334] "Generic (PLEG): container finished" podID="96abd097-100f-4694-962d-85d3cbdb86b3" containerID="0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049" exitCode=0 Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.483974 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qkbfk" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.483993 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerDied","Data":"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049"} Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.484057 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qkbfk" event={"ID":"96abd097-100f-4694-962d-85d3cbdb86b3","Type":"ContainerDied","Data":"7d2bf8b45f8a97c2be535d54b14e44d6349a4f3d1cf60adcb782e513abfb14e2"} Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.484082 4676 scope.go:117] "RemoveContainer" containerID="0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.508426 4676 scope.go:117] "RemoveContainer" containerID="c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.521361 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.530613 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qkbfk"] Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.532933 4676 scope.go:117] "RemoveContainer" containerID="28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.576560 4676 scope.go:117] "RemoveContainer" containerID="0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049" Dec 04 15:47:14 crc kubenswrapper[4676]: E1204 15:47:14.576967 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049\": container with ID starting with 0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049 not found: ID does not exist" containerID="0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.577011 4676 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049"} err="failed to get container status \"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049\": rpc error: code = NotFound desc = could not find container \"0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049\": container with ID starting with 0d47cd3568f31e879a8b47058b7f92e3d8d046bdeeb24540db0ecdd5d1022049 not found: ID does not exist" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.577040 4676 scope.go:117] "RemoveContainer" containerID="c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702" Dec 04 15:47:14 crc kubenswrapper[4676]: E1204 15:47:14.577446 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702\": container with ID starting with c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702 not found: ID does not exist" containerID="c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.577475 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702"} err="failed to get container status \"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702\": rpc error: code = NotFound desc = could not find container \"c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702\": container with ID starting with c25e715fc5000b0cb02e7d740abd438ebdac4caac6a0c42867d6610c53c6c702 not found: ID does not exist" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.577502 4676 scope.go:117] "RemoveContainer" containerID="28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44" Dec 04 15:47:14 crc kubenswrapper[4676]: E1204 15:47:14.577829 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44\": container with ID starting with 28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44 not found: ID does not exist" containerID="28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44" Dec 04 15:47:14 crc kubenswrapper[4676]: I1204 15:47:14.577851 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44"} err="failed to get container status \"28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44\": rpc error: code = NotFound desc = could not find container \"28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44\": container with ID starting with 28986cdd5e179b4b416525830a988318093d6b2dbcbafa3a2a19208565099b44 not found: ID does not exist" Dec 04 15:47:15 crc kubenswrapper[4676]: I1204 15:47:15.394856 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" path="/var/lib/kubelet/pods/96abd097-100f-4694-962d-85d3cbdb86b3/volumes" Dec 04 15:47:18 crc kubenswrapper[4676]: I1204 15:47:18.385168 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:47:18 crc kubenswrapper[4676]: E1204 15:47:18.385677 4676 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:47:31 crc kubenswrapper[4676]: I1204 15:47:31.385888 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:47:31 crc kubenswrapper[4676]: E1204 15:47:31.386800 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:47:42 crc kubenswrapper[4676]: I1204 15:47:42.385387 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:47:42 crc kubenswrapper[4676]: E1204 15:47:42.386288 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:47:55 crc kubenswrapper[4676]: I1204 15:47:55.499679 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:47:55 crc kubenswrapper[4676]: E1204 15:47:55.521656 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:47:59 crc kubenswrapper[4676]: I1204 15:47:59.358316 4676 scope.go:117] "RemoveContainer" containerID="34d8976aeadb642d2fff1879d582b70ebeafabbea2efb16561257aa1765964ca" Dec 04 15:47:59 crc kubenswrapper[4676]: I1204 15:47:59.389073 4676 scope.go:117] "RemoveContainer" containerID="acdb243aa13217a33e82fc1675df0c8a3eec896b26e577e06ccc51da0e81faec" Dec 04 15:47:59 crc kubenswrapper[4676]: I1204 15:47:59.410061 4676 scope.go:117] "RemoveContainer" containerID="5e73396ad1cc4b0ace8797fe26901e61a00c0ad55b59615922b6a5ecc7498573" Dec 04 15:47:59 crc kubenswrapper[4676]: I1204 15:47:59.433704 4676 scope.go:117] "RemoveContainer" containerID="619024e3a34effa642e1116e56a9ab51b73a34e02409595f29e6205fddccb644" Dec 04 15:48:02 crc kubenswrapper[4676]: I1204 15:48:02.086765 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-wthdz"] Dec 04 15:48:02 crc kubenswrapper[4676]: I1204 15:48:02.098432 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5bnn8"] Dec 04 15:48:02 crc kubenswrapper[4676]: I1204 15:48:02.110828 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-create-5bnn8"] Dec 04 15:48:02 crc kubenswrapper[4676]: I1204 15:48:02.121033 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-wthdz"] Dec 04 15:48:03 crc kubenswrapper[4676]: I1204 15:48:03.395937 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d480ec-6f21-494a-b5b6-d58c1842077d" path="/var/lib/kubelet/pods/01d480ec-6f21-494a-b5b6-d58c1842077d/volumes" Dec 04 15:48:03 crc kubenswrapper[4676]: I1204 15:48:03.396635 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c18d40-c03a-4c87-aa2c-ad743179dd6f" path="/var/lib/kubelet/pods/a0c18d40-c03a-4c87-aa2c-ad743179dd6f/volumes" Dec 04 15:48:05 crc kubenswrapper[4676]: I1204 15:48:05.027638 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zps4k"] Dec 04 15:48:05 crc kubenswrapper[4676]: I1204 15:48:05.037756 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zps4k"] Dec 04 15:48:05 crc kubenswrapper[4676]: I1204 15:48:05.397105 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42384168-5df1-4d2c-aec1-501e67ceb44e" path="/var/lib/kubelet/pods/42384168-5df1-4d2c-aec1-501e67ceb44e/volumes" Dec 04 15:48:07 crc kubenswrapper[4676]: I1204 15:48:07.384422 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:48:07 crc kubenswrapper[4676]: E1204 15:48:07.385287 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:48:15 crc kubenswrapper[4676]: I1204 15:48:15.038204 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fa18-account-create-zzrzb"] Dec 04 15:48:15 crc kubenswrapper[4676]: I1204 15:48:15.069058 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fa18-account-create-zzrzb"] Dec 04 15:48:15 crc kubenswrapper[4676]: I1204 15:48:15.396596 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e778f2-8276-4efa-b77c-ea0c86d5f5ff" path="/var/lib/kubelet/pods/91e778f2-8276-4efa-b77c-ea0c86d5f5ff/volumes" Dec 04 15:48:16 crc kubenswrapper[4676]: I1204 15:48:16.030644 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-07ee-account-create-qb5s4"] Dec 04 15:48:16 crc kubenswrapper[4676]: I1204 15:48:16.041436 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-07ee-account-create-qb5s4"] Dec 04 15:48:17 crc kubenswrapper[4676]: I1204 15:48:17.399616 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0ed69b4-f9ab-4a12-8bed-d6e639f518d1" path="/var/lib/kubelet/pods/e0ed69b4-f9ab-4a12-8bed-d6e639f518d1/volumes" Dec 04 15:48:18 crc kubenswrapper[4676]: I1204 15:48:18.029376 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-7cc2-account-create-kcqmh"] Dec 04 15:48:18 crc kubenswrapper[4676]: I1204 15:48:18.039012 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-7cc2-account-create-kcqmh"] Dec 04 15:48:19 crc kubenswrapper[4676]: I1204 15:48:19.395602 4676 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca609f9-fb1d-4be1-a208-d386b661cebf" path="/var/lib/kubelet/pods/bca609f9-fb1d-4be1-a208-d386b661cebf/volumes" Dec 04 15:48:21 crc kubenswrapper[4676]: I1204 15:48:21.384478 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:48:21 crc kubenswrapper[4676]: E1204 15:48:21.385006 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:48:35 crc kubenswrapper[4676]: I1204 15:48:35.383958 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:48:35 crc kubenswrapper[4676]: E1204 15:48:35.385927 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:48:42 crc kubenswrapper[4676]: I1204 15:48:42.041935 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-78ffb7b6cf-46b4r" podUID="10ac9a17-d069-484c-9f44-baaada4618f8" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 04 15:48:42 crc kubenswrapper[4676]: I1204 15:48:42.050392 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-m4p7c"] Dec 04 15:48:42 crc kubenswrapper[4676]: I1204 15:48:42.063602 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-vwnjh"] Dec 04 15:48:42 crc kubenswrapper[4676]: I1204 15:48:42.074457 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-m4p7c"] Dec 04 15:48:42 crc kubenswrapper[4676]: I1204 15:48:42.086758 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-vwnjh"] Dec 04 15:48:43 crc kubenswrapper[4676]: I1204 15:48:43.397174 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fec9aa8-63ba-40bb-9217-590ae458da93" path="/var/lib/kubelet/pods/3fec9aa8-63ba-40bb-9217-590ae458da93/volumes" Dec 04 15:48:43 crc kubenswrapper[4676]: I1204 15:48:43.398289 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="504e890d-08fd-41c1-b1cd-f0a9480e17df" path="/var/lib/kubelet/pods/504e890d-08fd-41c1-b1cd-f0a9480e17df/volumes" Dec 04 15:48:49 crc kubenswrapper[4676]: I1204 15:48:49.033067 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lxbp8"] Dec 04 15:48:49 crc kubenswrapper[4676]: I1204 15:48:49.044208 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lxbp8"] Dec 04 15:48:49 crc kubenswrapper[4676]: I1204 15:48:49.384719 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:48:49 crc kubenswrapper[4676]: E1204 15:48:49.385043 4676 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:48:49 crc kubenswrapper[4676]: I1204 15:48:49.395546 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20d4d2e0-ea26-476f-b7e6-fd922c301ba0" path="/var/lib/kubelet/pods/20d4d2e0-ea26-476f-b7e6-fd922c301ba0/volumes" Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.058878 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-gh7lx"] Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.069998 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-gh7lx"] Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.078198 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2e42-account-create-hzn75"] Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.088383 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2e42-account-create-hzn75"] Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.398103 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4c2e6a-2e63-4f64-9e3c-c14e6226727a" path="/var/lib/kubelet/pods/6e4c2e6a-2e63-4f64-9e3c-c14e6226727a/volumes" Dec 04 15:48:55 crc kubenswrapper[4676]: I1204 15:48:55.398831 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e23715-9b6f-4307-97f5-36289341911d" path="/var/lib/kubelet/pods/85e23715-9b6f-4307-97f5-36289341911d/volumes" Dec 04 15:48:58 crc kubenswrapper[4676]: I1204 15:48:58.028631 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f615-account-create-nscvm"] Dec 04 15:48:58 crc kubenswrapper[4676]: I1204 15:48:58.039741 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f615-account-create-nscvm"] Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.396348 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742fbc26-b6af-40c0-bd23-9c6bacbbe61c" path="/var/lib/kubelet/pods/742fbc26-b6af-40c0-bd23-9c6bacbbe61c/volumes" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.520450 4676 scope.go:117] "RemoveContainer" containerID="34de63e20dff3df88f43f5c080c02447ca98045b330017a58da2a557c7a04fa8" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.547449 4676 scope.go:117] "RemoveContainer" containerID="7def4f7329a205bf4dab65733e4954db001e46a20fe3862d1c3b58576f64f8dd" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.604115 4676 scope.go:117] "RemoveContainer" containerID="60ff3b9eb0e5b32f3f88a2b5a018541eb684066e81047d95f9f5804ef5698b36" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.664878 4676 scope.go:117] "RemoveContainer" containerID="24eaa47e6dfb02dd8e0da3a9bc69fa571bd81f3bbc2f5185e5940f61761077a9" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.717919 4676 scope.go:117] "RemoveContainer" containerID="6fac95b64599a521bfb8281e0706e50dbde4706b3fde5b39166b44c0178d6204" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.777864 4676 scope.go:117] "RemoveContainer" containerID="1625cfd497b9024c296cf4c1b522225d33c4e8616be121609fed2408b8d0a134" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.820400 4676 scope.go:117] 
"RemoveContainer" containerID="4b5711510172d5ec812817348a57b7874b77a37a96dfbf1d4f1ab15887a7d7cd" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.838510 4676 scope.go:117] "RemoveContainer" containerID="49bb83efa5bc52af304067610f962d6f160148b43b60fa43c218dfab9fe9a3b7" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.859030 4676 scope.go:117] "RemoveContainer" containerID="12339c749f3fd592625db4ac9a7ae46f8f0dfc6fd55f38fff1828475441daea4" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.883795 4676 scope.go:117] "RemoveContainer" containerID="3fcb2c69f4e90e86f37965b7715aa2fd4009c24cf1d8665b16c457c6c0eff841" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.908378 4676 scope.go:117] "RemoveContainer" containerID="61cb2c916525a98cf6a703c62e3947df817ddaff8a4fe1391b8b2b16b28219cd" Dec 04 15:48:59 crc kubenswrapper[4676]: I1204 15:48:59.930256 4676 scope.go:117] "RemoveContainer" containerID="44bcde57ce210f1f46a6edbc76309d1f472463aa1c1d13dc7f10b8a8e30431f8" Dec 04 15:49:00 crc kubenswrapper[4676]: I1204 15:49:00.099008 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-xhjnc"] Dec 04 15:49:00 crc kubenswrapper[4676]: I1204 15:49:00.112352 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-xhjnc"] Dec 04 15:49:01 crc kubenswrapper[4676]: I1204 15:49:01.030493 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-cmrp2"] Dec 04 15:49:01 crc kubenswrapper[4676]: I1204 15:49:01.041581 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-cmrp2"] Dec 04 15:49:01 crc kubenswrapper[4676]: I1204 15:49:01.440529 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="063e66f9-8c76-4a2c-9392-f35b247d1304" path="/var/lib/kubelet/pods/063e66f9-8c76-4a2c-9392-f35b247d1304/volumes" Dec 04 15:49:01 crc kubenswrapper[4676]: I1204 15:49:01.441316 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12cc7c9f-d211-490e-b297-0a250646e111" path="/var/lib/kubelet/pods/12cc7c9f-d211-490e-b297-0a250646e111/volumes" Dec 04 15:49:02 crc kubenswrapper[4676]: I1204 15:49:02.913226 4676 generic.go:334] "Generic (PLEG): container finished" podID="7778f969-2f94-4830-8685-bb42b6a9fd23" containerID="7dc943e58ed2151f6e7a1e47f76961848d719c80e688ae84617b47498f884f49" exitCode=0 Dec 04 15:49:02 crc kubenswrapper[4676]: I1204 15:49:02.913318 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" event={"ID":"7778f969-2f94-4830-8685-bb42b6a9fd23","Type":"ContainerDied","Data":"7dc943e58ed2151f6e7a1e47f76961848d719c80e688ae84617b47498f884f49"} Dec 04 15:49:03 crc kubenswrapper[4676]: I1204 15:49:03.392736 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:49:03 crc kubenswrapper[4676]: E1204 15:49:03.393280 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.540775 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.669009 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle\") pod \"7778f969-2f94-4830-8685-bb42b6a9fd23\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.669169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pg8d\" (UniqueName: \"kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d\") pod \"7778f969-2f94-4830-8685-bb42b6a9fd23\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.669265 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory\") pod \"7778f969-2f94-4830-8685-bb42b6a9fd23\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.669290 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key\") pod \"7778f969-2f94-4830-8685-bb42b6a9fd23\" (UID: \"7778f969-2f94-4830-8685-bb42b6a9fd23\") " Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.680120 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7778f969-2f94-4830-8685-bb42b6a9fd23" (UID: "7778f969-2f94-4830-8685-bb42b6a9fd23"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.680118 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d" (OuterVolumeSpecName: "kube-api-access-7pg8d") pod "7778f969-2f94-4830-8685-bb42b6a9fd23" (UID: "7778f969-2f94-4830-8685-bb42b6a9fd23"). InnerVolumeSpecName "kube-api-access-7pg8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.715992 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory" (OuterVolumeSpecName: "inventory") pod "7778f969-2f94-4830-8685-bb42b6a9fd23" (UID: "7778f969-2f94-4830-8685-bb42b6a9fd23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.718555 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7778f969-2f94-4830-8685-bb42b6a9fd23" (UID: "7778f969-2f94-4830-8685-bb42b6a9fd23"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.770890 4676 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.770966 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pg8d\" (UniqueName: \"kubernetes.io/projected/7778f969-2f94-4830-8685-bb42b6a9fd23-kube-api-access-7pg8d\") on node \"crc\" DevicePath \"\"" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.770976 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.770985 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7778f969-2f94-4830-8685-bb42b6a9fd23-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.937365 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" event={"ID":"7778f969-2f94-4830-8685-bb42b6a9fd23","Type":"ContainerDied","Data":"77a1be6faecaea0eeaa3e11a4cd93b55f26dccc84aa134b8828768694b90050a"} Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.937421 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77a1be6faecaea0eeaa3e11a4cd93b55f26dccc84aa134b8828768694b90050a" Dec 04 15:49:04 crc kubenswrapper[4676]: I1204 15:49:04.937455 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.046921 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz"] Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047713 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="extract-utilities" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047745 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="extract-utilities" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047766 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="extract-content" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047773 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="extract-content" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047789 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="extract-content" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047795 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="extract-content" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047822 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="registry-server" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047828 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="registry-server" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047843 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="registry-server" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047849 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="registry-server" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047860 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7778f969-2f94-4830-8685-bb42b6a9fd23" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047868 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7778f969-2f94-4830-8685-bb42b6a9fd23" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 15:49:05 crc kubenswrapper[4676]: E1204 15:49:05.047877 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="extract-utilities" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.047883 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="extract-utilities" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.048151 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="573a3f90-9310-4348-999a-2d0d705f86d7" containerName="registry-server" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.048177 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="96abd097-100f-4694-962d-85d3cbdb86b3" containerName="registry-server" Dec 04 15:49:05 crc 
kubenswrapper[4676]: I1204 15:49:05.048205 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7778f969-2f94-4830-8685-bb42b6a9fd23" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.049083 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.052412 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.052452 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.052424 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.052549 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.061298 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz"] Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.081111 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.081158 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.081199 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrt5v\" (UniqueName: \"kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.183937 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.184028 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.184072 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrt5v\" (UniqueName: \"kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.190554 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.191391 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.207234 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrt5v\" (UniqueName: \"kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.366476 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.921297 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz"] Dec 04 15:49:05 crc kubenswrapper[4676]: W1204 15:49:05.926566 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59ed14d8_9b88_49e8_ac61_213b3a6908e7.slice/crio-f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07 WatchSource:0}: Error finding container f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07: Status 404 returned error can't find the container with id f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07 Dec 04 15:49:05 crc kubenswrapper[4676]: I1204 15:49:05.948368 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" event={"ID":"59ed14d8-9b88-49e8-ac61-213b3a6908e7","Type":"ContainerStarted","Data":"f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07"} Dec 04 15:49:06 crc kubenswrapper[4676]: I1204 15:49:06.957684 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" event={"ID":"59ed14d8-9b88-49e8-ac61-213b3a6908e7","Type":"ContainerStarted","Data":"48f5085074710e598cc4226190de748eaf5b8cedf864b773b891868ed506e855"} Dec 04 15:49:06 crc kubenswrapper[4676]: I1204 15:49:06.978009 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" podStartSLOduration=1.495879907 podStartE2EDuration="1.977979313s" podCreationTimestamp="2025-12-04 15:49:05 +0000 UTC" firstStartedPulling="2025-12-04 15:49:05.929870197 +0000 UTC m=+1753.364540054" lastFinishedPulling="2025-12-04 15:49:06.411969603 +0000 UTC m=+1753.846639460" observedRunningTime="2025-12-04 15:49:06.976573313 +0000 UTC m=+1754.411243170" watchObservedRunningTime="2025-12-04 15:49:06.977979313 +0000 UTC m=+1754.412649170" Dec 04 15:49:14 crc kubenswrapper[4676]: I1204 15:49:14.032596 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0109-account-create-zfgjz"] Dec 04 15:49:14 crc kubenswrapper[4676]: I1204 15:49:14.043054 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2cec-account-create-fwvgn"] Dec 04 15:49:14 crc kubenswrapper[4676]: I1204 15:49:14.051849 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0109-account-create-zfgjz"] Dec 04 15:49:14 crc kubenswrapper[4676]: I1204 15:49:14.059692 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2cec-account-create-fwvgn"] Dec 04 15:49:15 crc kubenswrapper[4676]: I1204 15:49:15.401405 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7820f5-8870-4da3-8576-328966fdc552" path="/var/lib/kubelet/pods/0a7820f5-8870-4da3-8576-328966fdc552/volumes" Dec 04 15:49:15 crc kubenswrapper[4676]: I1204 15:49:15.402028 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4af1e4-191b-483a-9886-f07cc9829079" path="/var/lib/kubelet/pods/8e4af1e4-191b-483a-9886-f07cc9829079/volumes" Dec 04 15:49:17 crc kubenswrapper[4676]: I1204 15:49:17.384205 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:49:17 crc 
kubenswrapper[4676]: E1204 15:49:17.385359 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:49:30 crc kubenswrapper[4676]: I1204 15:49:30.385033 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:49:30 crc kubenswrapper[4676]: E1204 15:49:30.385761 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:49:41 crc kubenswrapper[4676]: I1204 15:49:41.384628 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:49:41 crc kubenswrapper[4676]: E1204 15:49:41.385554 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:49:52 crc kubenswrapper[4676]: I1204 15:49:52.384840 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:49:52 crc kubenswrapper[4676]: E1204 15:49:52.385739 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:49:56 crc kubenswrapper[4676]: I1204 15:49:56.044616 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jlg26"] Dec 04 15:49:56 crc kubenswrapper[4676]: I1204 15:49:56.057822 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jlg26"] Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.123631 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-llvh8"] Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.135022 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-6b4sd"] Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.146710 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-llvh8"] Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.158174 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-6b4sd"] Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.398107 4676 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4feecc1c-e63e-4063-947d-4c2c619525a7" path="/var/lib/kubelet/pods/4feecc1c-e63e-4063-947d-4c2c619525a7/volumes" Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.399007 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c93c13-31d1-4762-9457-90e32c63873e" path="/var/lib/kubelet/pods/89c93c13-31d1-4762-9457-90e32c63873e/volumes" Dec 04 15:49:57 crc kubenswrapper[4676]: I1204 15:49:57.399685 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7efd4bd-bb88-4422-9bd3-04ddb66d35a9" path="/var/lib/kubelet/pods/b7efd4bd-bb88-4422-9bd3-04ddb66d35a9/volumes" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.227189 4676 scope.go:117] "RemoveContainer" containerID="c7400cddab5773a1ef1b0b5b07a00195620e9e0bb5906b8e89c03029e7620bef" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.264929 4676 scope.go:117] "RemoveContainer" containerID="11c57be9a216605a8eb9cf338f53e60890a4725eff3cc5faa4e8d4b71e23302d" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.316753 4676 scope.go:117] "RemoveContainer" containerID="9ac6a9b70e7cb8225f2fff4e9dcf7c078f8b53f35739ef899a5b0e7928318e06" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.371663 4676 scope.go:117] "RemoveContainer" containerID="75ac6da838127e7ec899ecc0d54e089850e02abf2537c18b4c89930e36b0566a" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.408420 4676 scope.go:117] "RemoveContainer" containerID="968e54280a93060ccd7017e5c8d8dc4184f1217ca82c03089b0584c7098f8efa" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.449233 4676 scope.go:117] "RemoveContainer" containerID="b504620a44fd59ed7cfe1f1bb615ebcba66a9b4bce009c831026bbd1d75d22ad" Dec 04 15:50:00 crc kubenswrapper[4676]: I1204 15:50:00.516711 4676 scope.go:117] "RemoveContainer" containerID="bc4865c331287eaeefe44663a1a8a1cf9db6740287d27940ab743e1f0e51b2b3" Dec 04 15:50:03 crc kubenswrapper[4676]: I1204 15:50:03.407669 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:50:03 crc kubenswrapper[4676]: E1204 15:50:03.408440 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:50:17 crc kubenswrapper[4676]: I1204 15:50:17.040966 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-nnr52"] Dec 04 15:50:17 crc kubenswrapper[4676]: I1204 15:50:17.051829 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-nnr52"] Dec 04 15:50:17 crc kubenswrapper[4676]: I1204 15:50:17.385024 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994" Dec 04 15:50:17 crc kubenswrapper[4676]: I1204 15:50:17.398777 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8534e22-ee3e-4b6c-92a8-1790b69f335d" path="/var/lib/kubelet/pods/c8534e22-ee3e-4b6c-92a8-1790b69f335d/volumes" Dec 04 15:50:17 crc kubenswrapper[4676]: I1204 15:50:17.833231 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" 
event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991"} Dec 04 15:50:33 crc kubenswrapper[4676]: I1204 15:50:33.056740 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mxcxz"] Dec 04 15:50:33 crc kubenswrapper[4676]: I1204 15:50:33.073965 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mxcxz"] Dec 04 15:50:33 crc kubenswrapper[4676]: I1204 15:50:33.400002 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eaff04d-0c2d-4de6-ae7d-e0da6a64f997" path="/var/lib/kubelet/pods/1eaff04d-0c2d-4de6-ae7d-e0da6a64f997/volumes" Dec 04 15:50:34 crc kubenswrapper[4676]: I1204 15:50:34.037495 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pksjc"] Dec 04 15:50:34 crc kubenswrapper[4676]: I1204 15:50:34.049378 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pksjc"] Dec 04 15:50:35 crc kubenswrapper[4676]: I1204 15:50:35.397315 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac7518d-e354-42a9-85e4-766e455bf838" path="/var/lib/kubelet/pods/3ac7518d-e354-42a9-85e4-766e455bf838/volumes" Dec 04 15:51:00 crc kubenswrapper[4676]: I1204 15:51:00.671746 4676 scope.go:117] "RemoveContainer" containerID="d1f4f8e5e1f465b90a63581e1555bf9447784bf91a9c5d224acf43b302f36460" Dec 04 15:51:00 crc kubenswrapper[4676]: I1204 15:51:00.718621 4676 scope.go:117] "RemoveContainer" containerID="4c5f5c531c8768d6c4f1b6ff429a5e703561b00edafe069c4fb0c705f96d59cc" Dec 04 15:51:00 crc kubenswrapper[4676]: I1204 15:51:00.763916 4676 scope.go:117] "RemoveContainer" containerID="3824cedf3821404ecaa93361a03f6ca90e326fcb663133d0b9765ae49aef9e60" Dec 04 15:51:01 crc kubenswrapper[4676]: I1204 15:51:01.523253 4676 generic.go:334] "Generic (PLEG): container finished" podID="59ed14d8-9b88-49e8-ac61-213b3a6908e7" containerID="48f5085074710e598cc4226190de748eaf5b8cedf864b773b891868ed506e855" exitCode=0 Dec 04 15:51:01 crc kubenswrapper[4676]: I1204 15:51:01.523293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" event={"ID":"59ed14d8-9b88-49e8-ac61-213b3a6908e7","Type":"ContainerDied","Data":"48f5085074710e598cc4226190de748eaf5b8cedf864b773b891868ed506e855"} Dec 04 15:51:02 crc kubenswrapper[4676]: I1204 15:51:02.988772 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.090736 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key\") pod \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.090885 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory\") pod \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.090994 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrt5v\" (UniqueName: \"kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v\") pod \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\" (UID: \"59ed14d8-9b88-49e8-ac61-213b3a6908e7\") " Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.096968 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v" (OuterVolumeSpecName: "kube-api-access-vrt5v") pod "59ed14d8-9b88-49e8-ac61-213b3a6908e7" (UID: "59ed14d8-9b88-49e8-ac61-213b3a6908e7"). InnerVolumeSpecName "kube-api-access-vrt5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.123808 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory" (OuterVolumeSpecName: "inventory") pod "59ed14d8-9b88-49e8-ac61-213b3a6908e7" (UID: "59ed14d8-9b88-49e8-ac61-213b3a6908e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.126245 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "59ed14d8-9b88-49e8-ac61-213b3a6908e7" (UID: "59ed14d8-9b88-49e8-ac61-213b3a6908e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.208420 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.208615 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/59ed14d8-9b88-49e8-ac61-213b3a6908e7-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.208677 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrt5v\" (UniqueName: \"kubernetes.io/projected/59ed14d8-9b88-49e8-ac61-213b3a6908e7-kube-api-access-vrt5v\") on node \"crc\" DevicePath \"\"" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.599423 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" event={"ID":"59ed14d8-9b88-49e8-ac61-213b3a6908e7","Type":"ContainerDied","Data":"f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07"} Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.599497 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e22d4e9b8a4fcff06efa5811c449e8b8b2516bc1d56db833b531edae5f8c07" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.599530 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.700678 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"] Dec 04 15:51:03 crc kubenswrapper[4676]: E1204 15:51:03.701315 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ed14d8-9b88-49e8-ac61-213b3a6908e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.701341 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ed14d8-9b88-49e8-ac61-213b3a6908e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.701616 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ed14d8-9b88-49e8-ac61-213b3a6908e7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.702588 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.708921 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.710248 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.710454 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"] Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.710563 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.710773 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.825178 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.825250 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.825350 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7f87\" (UniqueName: \"kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.927372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.927450 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.927670 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7f87\" (UniqueName: \"kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:51:03 crc kubenswrapper[4676]: I1204 15:51:03.932642 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:51:04 crc kubenswrapper[4676]: I1204 15:51:04.023819 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:51:04 crc kubenswrapper[4676]: I1204 15:51:04.031438 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7f87\" (UniqueName: \"kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-htrzx\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:51:04 crc kubenswrapper[4676]: I1204 15:51:04.325426 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:51:04 crc kubenswrapper[4676]: I1204 15:51:04.858741 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 15:51:04 crc kubenswrapper[4676]: I1204 15:51:04.860858 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"]
Dec 04 15:51:05 crc kubenswrapper[4676]: I1204 15:51:05.620280 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" event={"ID":"1ada8c79-9112-4e01-9e1f-0289338b6191","Type":"ContainerStarted","Data":"4d1f2b17d534ca30aa18c82200ef616261139ebd82be2c39745dc3cc778ba9ca"}
Dec 04 15:51:06 crc kubenswrapper[4676]: I1204 15:51:06.633497 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" event={"ID":"1ada8c79-9112-4e01-9e1f-0289338b6191","Type":"ContainerStarted","Data":"e4bc498886ae1b182e582a5a32838eb215d0ec3a1928a18fba526cfc19466676"}
Dec 04 15:51:06 crc kubenswrapper[4676]: I1204 15:51:06.662614 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" podStartSLOduration=2.988847879 podStartE2EDuration="3.66257357s" podCreationTimestamp="2025-12-04 15:51:03 +0000 UTC" firstStartedPulling="2025-12-04 15:51:04.858383683 +0000 UTC m=+1872.293053550" lastFinishedPulling="2025-12-04 15:51:05.532109384 +0000 UTC m=+1872.966779241" observedRunningTime="2025-12-04 15:51:06.654262751 +0000 UTC m=+1874.088932628" watchObservedRunningTime="2025-12-04 15:51:06.66257357 +0000 UTC m=+1874.097243437"
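
The pod_startup_latency_tracker entry above reports two durations for the same pod, and the numbers fit a simple relationship: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). A minimal Go sketch that re-derives the logged values from the monotonic clock readings (the m=+... offsets) in the entry; the formula is inferred from how the logged numbers fit together, not quoted from kubelet source:

    package main

    import "fmt"

    func main() {
        // Monotonic offsets (m=+..., in seconds) copied from the entry above.
        firstStartedPulling := 1872.293053550
        lastFinishedPulling := 1872.966779241
        // watchObservedRunningTime (15:51:06.66257357) - podCreationTimestamp (15:51:03).
        e2e := 3.66257357

        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image pull window:   %.9fs\n", pull)     // 0.673725691s
        fmt.Printf("podStartSLOduration: %.9fs\n", e2e-pull) // 2.988847879s, as logged
    }

The same relationship holds for the validate-network, install-os, configure-os, and redhat-operators startup-duration entries later in this log.
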
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.043167 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-29dqv"]
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.052947 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9rfzz"]
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.062494 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9rfzz"]
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.071550 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-29dqv"]
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.395211 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925dde31-bb8c-4306-9dc9-5a7119e33f4e" path="/var/lib/kubelet/pods/925dde31-bb8c-4306-9dc9-5a7119e33f4e/volumes"
Dec 04 15:51:21 crc kubenswrapper[4676]: I1204 15:51:21.396106 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2e276b-e6c3-4302-a9d8-b63830394431" path="/var/lib/kubelet/pods/ad2e276b-e6c3-4302-a9d8-b63830394431/volumes"
Dec 04 15:51:22 crc kubenswrapper[4676]: I1204 15:51:22.026634 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-snxtt"]
Dec 04 15:51:22 crc kubenswrapper[4676]: I1204 15:51:22.036070 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-snxtt"]
Dec 04 15:51:23 crc kubenswrapper[4676]: I1204 15:51:23.396249 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6646cc-68b8-4672-be21-58ad781dd616" path="/var/lib/kubelet/pods/1a6646cc-68b8-4672-be21-58ad781dd616/volumes"
Dec 04 15:51:40 crc kubenswrapper[4676]: I1204 15:51:40.045448 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-3fd5-account-create-sgtfn"]
Dec 04 15:51:40 crc kubenswrapper[4676]: I1204 15:51:40.055670 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-3fd5-account-create-sgtfn"]
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.030854 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2c40-account-create-4hc7f"]
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.040179 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7e73-account-create-57czj"]
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.048426 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2c40-account-create-4hc7f"]
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.057381 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7e73-account-create-57czj"]
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.395179 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4206086a-944c-4c86-8e9c-1b4c9272c70d" path="/var/lib/kubelet/pods/4206086a-944c-4c86-8e9c-1b4c9272c70d/volumes"
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.395791 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b817005-97d2-4e1c-9363-15d8d0810d35" path="/var/lib/kubelet/pods/6b817005-97d2-4e1c-9363-15d8d0810d35/volumes"
Dec 04 15:51:41 crc kubenswrapper[4676]: I1204 15:51:41.396448 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b49237c-6903-4b6d-b833-4cebfa620ffd" path="/var/lib/kubelet/pods/8b49237c-6903-4b6d-b833-4cebfa620ffd/volumes"
Dec 04 15:52:00 crc kubenswrapper[4676]: I1204 15:52:00.906551 4676 scope.go:117] "RemoveContainer"
containerID="7c3b3b07cdc851357180f211f0d2ecddd8d41d8a0e7435837731585a4e49732b" Dec 04 15:52:00 crc kubenswrapper[4676]: I1204 15:52:00.936172 4676 scope.go:117] "RemoveContainer" containerID="19d298dea519a8bf7a1508e45ee4948d1ed84a75b7913e584e1a03a075c3d376" Dec 04 15:52:00 crc kubenswrapper[4676]: I1204 15:52:00.982647 4676 scope.go:117] "RemoveContainer" containerID="5c6a8ef93376487ba0e290a902147d5cf2bff7af2601cb4ae1373029517c4c6c" Dec 04 15:52:01 crc kubenswrapper[4676]: I1204 15:52:01.029367 4676 scope.go:117] "RemoveContainer" containerID="8288f34a40b8a3e176fd70f0e56ba568112913b863244d7378ed50cf20d7710c" Dec 04 15:52:01 crc kubenswrapper[4676]: I1204 15:52:01.081468 4676 scope.go:117] "RemoveContainer" containerID="c659b482b9b8fc05646417b89b5d822f64ffc5723b9f82533e46a3abb4b09cde" Dec 04 15:52:01 crc kubenswrapper[4676]: I1204 15:52:01.138284 4676 scope.go:117] "RemoveContainer" containerID="ef29697bd1b0e438405594ba443129f4c775c39c3b663c74a5dcf8386f71f4a5" Dec 04 15:52:06 crc kubenswrapper[4676]: I1204 15:52:06.049014 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lhqd6"] Dec 04 15:52:06 crc kubenswrapper[4676]: I1204 15:52:06.061103 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lhqd6"] Dec 04 15:52:07 crc kubenswrapper[4676]: I1204 15:52:07.396296 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dea9144-3173-4ad8-ab2a-d44cd0215507" path="/var/lib/kubelet/pods/9dea9144-3173-4ad8-ab2a-d44cd0215507/volumes" Dec 04 15:52:27 crc kubenswrapper[4676]: I1204 15:52:27.079520 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvkng"] Dec 04 15:52:27 crc kubenswrapper[4676]: I1204 15:52:27.087977 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mvkng"] Dec 04 15:52:27 crc kubenswrapper[4676]: I1204 15:52:27.444357 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654b6ea4-eb07-4074-a7ba-d743b87f6489" path="/var/lib/kubelet/pods/654b6ea4-eb07-4074-a7ba-d743b87f6489/volumes" Dec 04 15:52:27 crc kubenswrapper[4676]: I1204 15:52:27.579824 4676 generic.go:334] "Generic (PLEG): container finished" podID="1ada8c79-9112-4e01-9e1f-0289338b6191" containerID="e4bc498886ae1b182e582a5a32838eb215d0ec3a1928a18fba526cfc19466676" exitCode=0 Dec 04 15:52:27 crc kubenswrapper[4676]: I1204 15:52:27.579879 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" event={"ID":"1ada8c79-9112-4e01-9e1f-0289338b6191","Type":"ContainerDied","Data":"e4bc498886ae1b182e582a5a32838eb215d0ec3a1928a18fba526cfc19466676"} Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.033770 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.199080 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7f87\" (UniqueName: \"kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87\") pod \"1ada8c79-9112-4e01-9e1f-0289338b6191\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") "
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.199540 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key\") pod \"1ada8c79-9112-4e01-9e1f-0289338b6191\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") "
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.199627 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") pod \"1ada8c79-9112-4e01-9e1f-0289338b6191\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") "
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.205055 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87" (OuterVolumeSpecName: "kube-api-access-g7f87") pod "1ada8c79-9112-4e01-9e1f-0289338b6191" (UID: "1ada8c79-9112-4e01-9e1f-0289338b6191"). InnerVolumeSpecName "kube-api-access-g7f87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:52:29 crc kubenswrapper[4676]: E1204 15:52:29.228251 4676 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory podName:1ada8c79-9112-4e01-9e1f-0289338b6191 nodeName:}" failed. No retries permitted until 2025-12-04 15:52:29.728212636 +0000 UTC m=+1957.162882493 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory") pod "1ada8c79-9112-4e01-9e1f-0289338b6191" (UID: "1ada8c79-9112-4e01-9e1f-0289338b6191") : error deleting /var/lib/kubelet/pods/1ada8c79-9112-4e01-9e1f-0289338b6191/volume-subpaths: remove /var/lib/kubelet/pods/1ada8c79-9112-4e01-9e1f-0289338b6191/volume-subpaths: no such file or directory
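
The E-level nestedpendingoperations entry above is transient rather than fatal: the subPath cleanup for the inventory volume failed with ENOENT, so the operation is parked and re-queued, which is what "No retries permitted until ... (durationBeforeRetry 500ms)" expresses. A sketch of that retry pattern in Go; the 500ms initial delay is taken from the log line, while the doubling factor and the cap are illustrative assumptions rather than values read from this log:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryWithBackoff re-runs op until it succeeds, waiting longer after
    // each failure, in the spirit of the kubelet's per-volume-operation backoff.
    func retryWithBackoff(op func() error) {
        delay := 500 * time.Millisecond  // durationBeforeRetry from the entry above
        const maxDelay = 2 * time.Minute // assumed cap, for illustration only
        for attempt := 1; ; attempt++ {
            err := op()
            if err == nil {
                return
            }
            fmt.Printf("attempt %d failed: %v; no retries permitted until %s\n",
                attempt, err, time.Now().Add(delay).Format(time.RFC3339Nano))
            time.Sleep(delay)
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }

    func main() {
        calls := 0
        retryWithBackoff(func() error {
            calls++
            if calls < 2 {
                return errors.New("remove volume-subpaths: no such file or directory")
            }
            return nil
        })
    }

Here the retry visibly succeeds on the next attempt: the inventory unmount is re-queued and TearDown succeeds at 15:52:29.813/15:52:29.818 further down, just over half a second after the failure.
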
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.234038 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ada8c79-9112-4e01-9e1f-0289338b6191" (UID: "1ada8c79-9112-4e01-9e1f-0289338b6191"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.301891 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.301955 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7f87\" (UniqueName: \"kubernetes.io/projected/1ada8c79-9112-4e01-9e1f-0289338b6191-kube-api-access-g7f87\") on node \"crc\" DevicePath \"\""
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.599329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx" event={"ID":"1ada8c79-9112-4e01-9e1f-0289338b6191","Type":"ContainerDied","Data":"4d1f2b17d534ca30aa18c82200ef616261139ebd82be2c39745dc3cc778ba9ca"}
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.599384 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1f2b17d534ca30aa18c82200ef616261139ebd82be2c39745dc3cc778ba9ca"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.599407 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-htrzx"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.694656 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd"]
Dec 04 15:52:29 crc kubenswrapper[4676]: E1204 15:52:29.695267 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ada8c79-9112-4e01-9e1f-0289338b6191" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.695308 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ada8c79-9112-4e01-9e1f-0289338b6191" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.695657 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ada8c79-9112-4e01-9e1f-0289338b6191" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.697150 4676 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.704383 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd"] Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.711297 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqrl\" (UniqueName: \"kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.711658 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.711697 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.813140 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") pod \"1ada8c79-9112-4e01-9e1f-0289338b6191\" (UID: \"1ada8c79-9112-4e01-9e1f-0289338b6191\") " Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.813359 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.813397 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.813785 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqrl\" (UniqueName: \"kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.818607 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory" (OuterVolumeSpecName: "inventory") pod "1ada8c79-9112-4e01-9e1f-0289338b6191" (UID: "1ada8c79-9112-4e01-9e1f-0289338b6191"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.819272 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.831213 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.831399 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqrl\" (UniqueName: \"kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:29 crc kubenswrapper[4676]: I1204 15:52:29.915630 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ada8c79-9112-4e01-9e1f-0289338b6191-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:52:30 crc kubenswrapper[4676]: I1204 15:52:30.014024 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:30 crc kubenswrapper[4676]: I1204 15:52:30.090880 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8b49f"] Dec 04 15:52:30 crc kubenswrapper[4676]: I1204 15:52:30.101254 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-8b49f"] Dec 04 15:52:30 crc kubenswrapper[4676]: I1204 15:52:30.580065 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd"] Dec 04 15:52:30 crc kubenswrapper[4676]: I1204 15:52:30.612458 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" event={"ID":"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca","Type":"ContainerStarted","Data":"d89b146eacb61bc69b46b19a114a3cf6fd6fb319c1c65580a95da442ea8b6a78"} Dec 04 15:52:31 crc kubenswrapper[4676]: I1204 15:52:31.399539 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="282e9515-3aa8-49a9-a752-253d7cdf6b9f" path="/var/lib/kubelet/pods/282e9515-3aa8-49a9-a752-253d7cdf6b9f/volumes" Dec 04 15:52:32 crc kubenswrapper[4676]: I1204 15:52:32.630287 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" event={"ID":"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca","Type":"ContainerStarted","Data":"1fa451868de13df0c6ed99eb38ade538f3898d180cec354e0efd4d5da4adf03e"} Dec 04 15:52:32 crc kubenswrapper[4676]: I1204 15:52:32.652423 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" podStartSLOduration=2.900504076 podStartE2EDuration="3.652390213s" podCreationTimestamp="2025-12-04 15:52:29 +0000 UTC" firstStartedPulling="2025-12-04 15:52:30.580929003 +0000 UTC m=+1958.015598860" lastFinishedPulling="2025-12-04 15:52:31.33281514 +0000 UTC m=+1958.767484997" observedRunningTime="2025-12-04 15:52:32.649593103 +0000 UTC m=+1960.084262960" watchObservedRunningTime="2025-12-04 15:52:32.652390213 +0000 UTC m=+1960.087060070" Dec 04 15:52:37 crc kubenswrapper[4676]: I1204 15:52:37.681235 4676 generic.go:334] "Generic (PLEG): container finished" podID="c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" containerID="1fa451868de13df0c6ed99eb38ade538f3898d180cec354e0efd4d5da4adf03e" exitCode=0 Dec 04 15:52:37 crc kubenswrapper[4676]: I1204 15:52:37.681436 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" event={"ID":"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca","Type":"ContainerDied","Data":"1fa451868de13df0c6ed99eb38ade538f3898d180cec354e0efd4d5da4adf03e"} Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.202167 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.363950 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqrl\" (UniqueName: \"kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl\") pod \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.364256 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory\") pod \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.364348 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key\") pod \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\" (UID: \"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca\") " Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.378648 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl" (OuterVolumeSpecName: "kube-api-access-bdqrl") pod "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" (UID: "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca"). InnerVolumeSpecName "kube-api-access-bdqrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.400207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" (UID: "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.402377 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory" (OuterVolumeSpecName: "inventory") pod "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" (UID: "c2ce3b93-6fd0-432c-8f42-99cc96bd0aca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.467529 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.467560 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqrl\" (UniqueName: \"kubernetes.io/projected/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-kube-api-access-bdqrl\") on node \"crc\" DevicePath \"\""
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.467571 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2ce3b93-6fd0-432c-8f42-99cc96bd0aca-inventory\") on node \"crc\" DevicePath \"\""
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.772342 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd" event={"ID":"c2ce3b93-6fd0-432c-8f42-99cc96bd0aca","Type":"ContainerDied","Data":"d89b146eacb61bc69b46b19a114a3cf6fd6fb319c1c65580a95da442ea8b6a78"}
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.772394 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d89b146eacb61bc69b46b19a114a3cf6fd6fb319c1c65580a95da442ea8b6a78"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.772454 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.825521 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"]
Dec 04 15:52:39 crc kubenswrapper[4676]: E1204 15:52:39.826038 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.826061 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.826291 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ce3b93-6fd0-432c-8f42-99cc96bd0aca" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
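
Each EDPM deployment step in this log is a one-shot pod with the same shape: the previous step's pod is torn down (volumes detached, stale CPU- and memory-manager state removed), then the next step is ADDed with an ssh-key Secret volume, an inventory Secret volume, and an automatically injected kube-api-access-* projected token. A client-go sketch of that volume layout; the SecretName mapping is an assumption based on the Secrets whose caches are populated just below (dataplane-ansible-ssh-private-key-secret for ssh-key, dataplanenodeset-openstack-edpm-ipam for inventory), not something the kubelet log states directly:

    package main

    import corev1 "k8s.io/api/core/v1"

    // edpmJobVolumes sketches the two Secret volumes every EDPM job pod in
    // this log mounts. The kube-api-access-* projected token volume seen in
    // the kubelet entries is injected by service-account admission, so it
    // does not appear in the authored pod spec.
    func edpmJobVolumes() []corev1.Volume {
        return []corev1.Volume{
            {
                Name: "ssh-key", // assumed source: dataplane-ansible-ssh-private-key-secret
                VolumeSource: corev1.VolumeSource{
                    Secret: &corev1.SecretVolumeSource{SecretName: "dataplane-ansible-ssh-private-key-secret"},
                },
            },
            {
                Name: "inventory", // assumed source: dataplanenodeset-openstack-edpm-ipam
                VolumeSource: corev1.VolumeSource{
                    Secret: &corev1.SecretVolumeSource{SecretName: "dataplanenodeset-openstack-edpm-ipam"},
                },
            },
        }
    }

    func main() { _ = edpmJobVolumes() }
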
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.827124 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.832502 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.832615 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.832794 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.832962 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.836820 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"]
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.974460 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzc7\" (UniqueName: \"kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.975524 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:39 crc kubenswrapper[4676]: I1204 15:52:39.975723 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.077221 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.077331 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.077419 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzc7\" (UniqueName: \"kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID:
\"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.082880 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.083655 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.098027 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzc7\" (UniqueName: \"kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-qtl7t\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.149286 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.703428 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"] Dec 04 15:52:40 crc kubenswrapper[4676]: I1204 15:52:40.783642 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" event={"ID":"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee","Type":"ContainerStarted","Data":"4820f0cf73b7e78c9f8f9b4cfdd0cd57cbf89c95d9f135954c59bbabc9580762"} Dec 04 15:52:41 crc kubenswrapper[4676]: I1204 15:52:41.793387 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" event={"ID":"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee","Type":"ContainerStarted","Data":"463e0949593aef95cf28e6619069e80fb00af0cf14aeca6c6800fc84eb6352ed"} Dec 04 15:52:41 crc kubenswrapper[4676]: I1204 15:52:41.811425 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" podStartSLOduration=2.353867982 podStartE2EDuration="2.811405458s" podCreationTimestamp="2025-12-04 15:52:39 +0000 UTC" firstStartedPulling="2025-12-04 15:52:40.710680393 +0000 UTC m=+1968.145350250" lastFinishedPulling="2025-12-04 15:52:41.168217869 +0000 UTC m=+1968.602887726" observedRunningTime="2025-12-04 15:52:41.808388931 +0000 UTC m=+1969.243058788" watchObservedRunningTime="2025-12-04 15:52:41.811405458 +0000 UTC m=+1969.246075305" Dec 04 15:52:46 crc kubenswrapper[4676]: I1204 15:52:46.026641 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:52:46 crc kubenswrapper[4676]: I1204 15:52:46.027164 4676 prober.go:107] "Probe 
Dec 04 15:52:46 crc kubenswrapper[4676]: I1204 15:52:46.027164 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:53:01 crc kubenswrapper[4676]: I1204 15:53:01.293051 4676 scope.go:117] "RemoveContainer" containerID="ace2ef9a22a2171efb1772e651ff254eca28043d331a6a0802a7a96c7ef94df2"
Dec 04 15:53:01 crc kubenswrapper[4676]: I1204 15:53:01.351695 4676 scope.go:117] "RemoveContainer" containerID="40a9ab17647fbd3e6a32f5508d34aeafa1667d10a701981925d5b888ca267998"
Dec 04 15:53:01 crc kubenswrapper[4676]: I1204 15:53:01.404079 4676 scope.go:117] "RemoveContainer" containerID="349c81e6cadc8dbee6c218cb89424578435b0f2e4e587da0b9a04a2e0fc8eeb3"
Dec 04 15:53:10 crc kubenswrapper[4676]: I1204 15:53:10.225877 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-wfvln"]
Dec 04 15:53:10 crc kubenswrapper[4676]: I1204 15:53:10.235929 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-wfvln"]
Dec 04 15:53:11 crc kubenswrapper[4676]: I1204 15:53:11.399320 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add0c0ae-e35b-47c2-b4f3-15af24cd97bf" path="/var/lib/kubelet/pods/add0c0ae-e35b-47c2-b4f3-15af24cd97bf/volumes"
Dec 04 15:53:16 crc kubenswrapper[4676]: I1204 15:53:16.027225 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:53:16 crc kubenswrapper[4676]: I1204 15:53:16.027875 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:53:22 crc kubenswrapper[4676]: I1204 15:53:22.443505 4676 generic.go:334] "Generic (PLEG): container finished" podID="dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" containerID="463e0949593aef95cf28e6619069e80fb00af0cf14aeca6c6800fc84eb6352ed" exitCode=0
Dec 04 15:53:22 crc kubenswrapper[4676]: I1204 15:53:22.443618 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" event={"ID":"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee","Type":"ContainerDied","Data":"463e0949593aef95cf28e6619069e80fb00af0cf14aeca6c6800fc84eb6352ed"}
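
The machine-config-daemon liveness probe keeps failing above (15:52:46 and again at 15:53:16) because nothing answers on 127.0.0.1:8798. Functionally an HTTP probe of this kind is just a bounded GET where a 2xx/3xx response counts as success; a stand-alone Go equivalent (the URL is from the log, while the 1-second timeout and the status-code policy are assumptions for illustration):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP GET; connection refused (as in the log above)
    // surfaces as a transport error before any status code is seen.
    func probe(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "dial tcp 127.0.0.1:8798: connect: connection refused"
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return nil
        }
        return fmt.Errorf("unexpected status %d", resp.StatusCode)
    }

    func main() {
        if err := probe("http://127.0.0.1:8798/health", time.Second); err != nil {
            fmt.Println("Liveness probe failure:", err)
        }
    }

The failures recur every 30 seconds (15:52:46, 15:53:16, 15:53:46) and the third one, further down, triggers the container restart, consistent with a 30s probe period and the default failureThreshold of 3.
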
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.888407 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t"
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.959685 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key\") pod \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") "
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.960268 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdzc7\" (UniqueName: \"kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7\") pod \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") "
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.960321 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory\") pod \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\" (UID: \"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee\") "
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.966540 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7" (OuterVolumeSpecName: "kube-api-access-rdzc7") pod "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" (UID: "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee"). InnerVolumeSpecName "kube-api-access-rdzc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.990076 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory" (OuterVolumeSpecName: "inventory") pod "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" (UID: "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:53:23 crc kubenswrapper[4676]: I1204 15:53:23.990536 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" (UID: "dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee"). InnerVolumeSpecName "ssh-key".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.061841 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.061889 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdzc7\" (UniqueName: \"kubernetes.io/projected/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-kube-api-access-rdzc7\") on node \"crc\" DevicePath \"\"" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.061958 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.464990 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" event={"ID":"dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee","Type":"ContainerDied","Data":"4820f0cf73b7e78c9f8f9b4cfdd0cd57cbf89c95d9f135954c59bbabc9580762"} Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.465035 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4820f0cf73b7e78c9f8f9b4cfdd0cd57cbf89c95d9f135954c59bbabc9580762" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.465091 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-qtl7t" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.568889 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph"] Dec 04 15:53:24 crc kubenswrapper[4676]: E1204 15:53:24.569405 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.569440 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.569715 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.570541 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.571584 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqr6s\" (UniqueName: \"kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.571838 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.572034 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.574309 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.574542 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.574636 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.574859 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.589084 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph"] Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.673134 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.673224 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.673302 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqr6s\" (UniqueName: \"kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" 
(UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.678361 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.678595 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.690947 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqr6s\" (UniqueName: \"kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kmlph\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:24 crc kubenswrapper[4676]: I1204 15:53:24.886704 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" Dec 04 15:53:25 crc kubenswrapper[4676]: I1204 15:53:25.470875 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph"] Dec 04 15:53:26 crc kubenswrapper[4676]: I1204 15:53:26.486725 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" event={"ID":"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d","Type":"ContainerStarted","Data":"68e248d49f47bc5eec882d8f892991027a6dd89f953205f45b4c7439bf877a7a"} Dec 04 15:53:27 crc kubenswrapper[4676]: I1204 15:53:27.496487 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" event={"ID":"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d","Type":"ContainerStarted","Data":"8766a8dfa0973375c3dfa15e568624a1ae4358e72037d29816808f20f6775b9f"} Dec 04 15:53:27 crc kubenswrapper[4676]: I1204 15:53:27.518067 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" podStartSLOduration=1.8389616659999999 podStartE2EDuration="3.518023623s" podCreationTimestamp="2025-12-04 15:53:24 +0000 UTC" firstStartedPulling="2025-12-04 15:53:25.474611481 +0000 UTC m=+2012.909281338" lastFinishedPulling="2025-12-04 15:53:27.153673438 +0000 UTC m=+2014.588343295" observedRunningTime="2025-12-04 15:53:27.512310329 +0000 UTC m=+2014.946980186" watchObservedRunningTime="2025-12-04 15:53:27.518023623 +0000 UTC m=+2014.952693500" Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.206206 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qhjg6"] Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.209041 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.217397 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhjg6"]
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.268596 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hgm2\" (UniqueName: \"kubernetes.io/projected/3911d80c-3e19-4fbf-ace6-752742bea61a-kube-api-access-8hgm2\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.268693 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-utilities\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.268786 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-catalog-content\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.370496 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hgm2\" (UniqueName: \"kubernetes.io/projected/3911d80c-3e19-4fbf-ace6-752742bea61a-kube-api-access-8hgm2\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.370565 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-utilities\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.370639 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-catalog-content\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.371224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-utilities\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.371269 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3911d80c-3e19-4fbf-ace6-752742bea61a-catalog-content\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.394987 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hgm2\" (UniqueName: \"kubernetes.io/projected/3911d80c-3e19-4fbf-ace6-752742bea61a-kube-api-access-8hgm2\") pod \"redhat-operators-qhjg6\" (UID: \"3911d80c-3e19-4fbf-ace6-752742bea61a\") " pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:41 crc kubenswrapper[4676]: I1204 15:53:41.545307 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:53:42 crc kubenswrapper[4676]: I1204 15:53:42.044849 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhjg6"]
Dec 04 15:53:42 crc kubenswrapper[4676]: W1204 15:53:42.050723 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3911d80c_3e19_4fbf_ace6_752742bea61a.slice/crio-c6026a10175837817a53f8a25de644fff0187cdc301be50e73b1396e99705a2a WatchSource:0}: Error finding container c6026a10175837817a53f8a25de644fff0187cdc301be50e73b1396e99705a2a: Status 404 returned error can't find the container with id c6026a10175837817a53f8a25de644fff0187cdc301be50e73b1396e99705a2a
Dec 04 15:53:42 crc kubenswrapper[4676]: I1204 15:53:42.850261 4676 generic.go:334] "Generic (PLEG): container finished" podID="3911d80c-3e19-4fbf-ace6-752742bea61a" containerID="641af20bd5bc65d8e170c0782220a9be46105e4a6b535313ce4eac93ad45eb14" exitCode=0
Dec 04 15:53:42 crc kubenswrapper[4676]: I1204 15:53:42.850467 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhjg6" event={"ID":"3911d80c-3e19-4fbf-ace6-752742bea61a","Type":"ContainerDied","Data":"641af20bd5bc65d8e170c0782220a9be46105e4a6b535313ce4eac93ad45eb14"}
Dec 04 15:53:42 crc kubenswrapper[4676]: I1204 15:53:42.850567 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhjg6" event={"ID":"3911d80c-3e19-4fbf-ace6-752742bea61a","Type":"ContainerStarted","Data":"c6026a10175837817a53f8a25de644fff0187cdc301be50e73b1396e99705a2a"}
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.026370 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.026686 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.026729 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.027508 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.027576 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991" gracePeriod=600
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.891443 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991" exitCode=0
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.891567 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991"}
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.892126 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640"}
Dec 04 15:53:46 crc kubenswrapper[4676]: I1204 15:53:46.892245 4676 scope.go:117] "RemoveContainer" containerID="ffbb32dfc42191cded572f5ebd8321e77f8d1095701529dac0a77b6c969a2994"
Dec 04 15:53:54 crc kubenswrapper[4676]: I1204 15:53:54.970706 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhjg6" event={"ID":"3911d80c-3e19-4fbf-ace6-752742bea61a","Type":"ContainerStarted","Data":"e0a0a4dba9af7191b1c5e234ec4140439ac37254714b02a0eefc3008f38b511c"}
Dec 04 15:53:56 crc kubenswrapper[4676]: I1204 15:53:56.989854 4676 generic.go:334] "Generic (PLEG): container finished" podID="3911d80c-3e19-4fbf-ace6-752742bea61a" containerID="e0a0a4dba9af7191b1c5e234ec4140439ac37254714b02a0eefc3008f38b511c" exitCode=0
Dec 04 15:53:56 crc kubenswrapper[4676]: I1204 15:53:56.989986 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhjg6" event={"ID":"3911d80c-3e19-4fbf-ace6-752742bea61a","Type":"ContainerDied","Data":"e0a0a4dba9af7191b1c5e234ec4140439ac37254714b02a0eefc3008f38b511c"}
Dec 04 15:53:58 crc kubenswrapper[4676]: I1204 15:53:58.036077 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhjg6" event={"ID":"3911d80c-3e19-4fbf-ace6-752742bea61a","Type":"ContainerStarted","Data":"2c1a010abd49496ed3c33ea1e0173c6dfbea3689ae83cf73923c789918eb17a5"}
Dec 04 15:53:58 crc kubenswrapper[4676]: I1204 15:53:58.062379 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qhjg6" podStartSLOduration=2.535150713 podStartE2EDuration="17.062361006s" podCreationTimestamp="2025-12-04 15:53:41 +0000 UTC" firstStartedPulling="2025-12-04 15:53:42.852142188 +0000 UTC m=+2030.286812045" lastFinishedPulling="2025-12-04 15:53:57.379352481 +0000 UTC m=+2044.814022338" observedRunningTime="2025-12-04 15:53:58.055367835 +0000 UTC m=+2045.490037712" watchObservedRunningTime="2025-12-04 15:53:58.062361006 +0000 UTC m=+2045.497030863"
Dec 04 15:54:01 crc kubenswrapper[4676]: I1204 15:54:01.542008 4676 scope.go:117] "RemoveContainer" containerID="e430f27031e3208c1416a3a4c8552d7a026ac0f1ec4c0f9d880cdd8d2a124fb5"
Dec 04 15:54:01 crc kubenswrapper[4676]: I1204 15:54:01.546530 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:54:01 crc kubenswrapper[4676]: I1204 15:54:01.546578 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:54:02 crc kubenswrapper[4676]: I1204 15:54:02.591320 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhjg6" podUID="3911d80c-3e19-4fbf-ace6-752742bea61a" containerName="registry-server" probeResult="failure" output=<
Dec 04 15:54:02 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s
Dec 04 15:54:02 crc kubenswrapper[4676]: >
Dec 04 15:54:11 crc kubenswrapper[4676]: I1204 15:54:11.597609 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:54:11 crc kubenswrapper[4676]: I1204 15:54:11.721967 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhjg6"
Dec 04 15:54:12 crc kubenswrapper[4676]: I1204 15:54:12.238230 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhjg6"]
Dec 04 15:54:12 crc kubenswrapper[4676]: I1204 15:54:12.402239 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"]
Dec 04 15:54:12 crc kubenswrapper[4676]: I1204 15:54:12.402776 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8dk8v" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="registry-server" containerID="cri-o://810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2" gracePeriod=2
Dec 04 15:54:12 crc kubenswrapper[4676]: I1204 15:54:12.984092 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dk8v"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.174741 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content\") pod \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") "
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.174796 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2k5l\" (UniqueName: \"kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l\") pod \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") "
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.175005 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities\") pod \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\" (UID: \"1d91ab6c-0b23-464e-a8d3-5be12c97971e\") "
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.175989 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities" (OuterVolumeSpecName: "utilities") pod "1d91ab6c-0b23-464e-a8d3-5be12c97971e" (UID: "1d91ab6c-0b23-464e-a8d3-5be12c97971e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.186127 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l" (OuterVolumeSpecName: "kube-api-access-h2k5l") pod "1d91ab6c-0b23-464e-a8d3-5be12c97971e" (UID: "1d91ab6c-0b23-464e-a8d3-5be12c97971e"). InnerVolumeSpecName "kube-api-access-h2k5l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.192734 4676 generic.go:334] "Generic (PLEG): container finished" podID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerID="810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2" exitCode=0
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.192807 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerDied","Data":"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"}
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.192853 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dk8v"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.192879 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dk8v" event={"ID":"1d91ab6c-0b23-464e-a8d3-5be12c97971e","Type":"ContainerDied","Data":"a4a1de6848a5a846ffd7defb17d2717fb95d501f2496521b5cba44bcf64224a8"}
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.192961 4676 scope.go:117] "RemoveContainer" containerID="810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.257125 4676 scope.go:117] "RemoveContainer" containerID="8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.285066 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2k5l\" (UniqueName: \"kubernetes.io/projected/1d91ab6c-0b23-464e-a8d3-5be12c97971e-kube-api-access-h2k5l\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.285102 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.286373 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d91ab6c-0b23-464e-a8d3-5be12c97971e" (UID: "1d91ab6c-0b23-464e-a8d3-5be12c97971e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.312035 4676 scope.go:117] "RemoveContainer" containerID="30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.363061 4676 scope.go:117] "RemoveContainer" containerID="810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"
Dec 04 15:54:13 crc kubenswrapper[4676]: E1204 15:54:13.363609 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2\": container with ID starting with 810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2 not found: ID does not exist" containerID="810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.363653 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2"} err="failed to get container status \"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2\": rpc error: code = NotFound desc = could not find container \"810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2\": container with ID starting with 810665b53ae4ac283af9e0fc5f3ab193d173afcabaadc2ec291a34259e4fc8c2 not found: ID does not exist"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.363701 4676 scope.go:117] "RemoveContainer" containerID="8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190"
Dec 04 15:54:13 crc kubenswrapper[4676]: E1204 15:54:13.364986 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190\": container with ID starting with 8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190 not found: ID does not exist" containerID="8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.365019 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190"} err="failed to get container status \"8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190\": rpc error: code = NotFound desc = could not find container \"8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190\": container with ID starting with 8d953e9af491f06b68a0ad2663e5473b39c87061b484d4dad74c124c705b7190 not found: ID does not exist"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.365036 4676 scope.go:117] "RemoveContainer" containerID="30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668"
Dec 04 15:54:13 crc kubenswrapper[4676]: E1204 15:54:13.365606 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668\": container with ID starting with 30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668 not found: ID does not exist" containerID="30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.365659 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668"} err="failed to get container status \"30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668\": rpc error: code = NotFound desc = could not find container \"30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668\": container with ID starting with 30be2cb3c7f963df675e54de9c51479c1893fcf3f1884b43066dac6a97cd7668 not found: ID does not exist"
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.387450 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d91ab6c-0b23-464e-a8d3-5be12c97971e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.521667 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"]
Dec 04 15:54:13 crc kubenswrapper[4676]: I1204 15:54:13.529852 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8dk8v"]
Dec 04 15:54:15 crc kubenswrapper[4676]: I1204 15:54:15.396536 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" path="/var/lib/kubelet/pods/1d91ab6c-0b23-464e-a8d3-5be12c97971e/volumes"
Dec 04 15:54:24 crc kubenswrapper[4676]: I1204 15:54:24.310106 4676 generic.go:334] "Generic (PLEG): container finished" podID="fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" containerID="8766a8dfa0973375c3dfa15e568624a1ae4358e72037d29816808f20f6775b9f" exitCode=0
Dec 04 15:54:24 crc kubenswrapper[4676]: I1204 15:54:24.310185 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" event={"ID":"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d","Type":"ContainerDied","Data":"8766a8dfa0973375c3dfa15e568624a1ae4358e72037d29816808f20f6775b9f"}
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.782021 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph"
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.970048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory\") pod \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") "
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.970130 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqr6s\" (UniqueName: \"kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s\") pod \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") "
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.970318 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key\") pod \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\" (UID: \"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d\") "
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.975868 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s" (OuterVolumeSpecName: "kube-api-access-bqr6s") pod "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" (UID: "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d"). InnerVolumeSpecName "kube-api-access-bqr6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:54:25 crc kubenswrapper[4676]: I1204 15:54:25.998592 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" (UID: "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.000548 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory" (OuterVolumeSpecName: "inventory") pod "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" (UID: "fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.073351 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.073390 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-inventory\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.073407 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqr6s\" (UniqueName: \"kubernetes.io/projected/fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d-kube-api-access-bqr6s\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.342208 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph" event={"ID":"fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d","Type":"ContainerDied","Data":"68e248d49f47bc5eec882d8f892991027a6dd89f953205f45b4c7439bf877a7a"}
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.342246 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e248d49f47bc5eec882d8f892991027a6dd89f953205f45b4c7439bf877a7a"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.342299 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kmlph"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445022 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kvjcq"]
Dec 04 15:54:26 crc kubenswrapper[4676]: E1204 15:54:26.445538 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="extract-utilities"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445572 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="extract-utilities"
Dec 04 15:54:26 crc kubenswrapper[4676]: E1204 15:54:26.445590 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="registry-server"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445598 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="registry-server"
Dec 04 15:54:26 crc kubenswrapper[4676]: E1204 15:54:26.445622 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445633 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:26 crc kubenswrapper[4676]: E1204 15:54:26.445659 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="extract-content"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445666 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="extract-content"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445934 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.445955 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d91ab6c-0b23-464e-a8d3-5be12c97971e" containerName="registry-server"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.446659 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.451811 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.452133 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.452426 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.452583 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.460814 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kvjcq"]
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.582576 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.583029 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7m2t\" (UniqueName: \"kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.583068 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.686068 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.686188 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7m2t\" (UniqueName: \"kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.686232 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.690251 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.695501 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.705256 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7m2t\" (UniqueName: \"kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t\") pod \"ssh-known-hosts-edpm-deployment-kvjcq\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") " pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:26 crc kubenswrapper[4676]: I1204 15:54:26.768759 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:27 crc kubenswrapper[4676]: I1204 15:54:27.303439 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-kvjcq"]
Dec 04 15:54:27 crc kubenswrapper[4676]: I1204 15:54:27.353857 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq" event={"ID":"ed758cb2-028d-43a2-b04a-3b494673e6f6","Type":"ContainerStarted","Data":"3c432f7a172910580c8e0e74e0b4f0366fd5f18224a020a62099b1e03108366d"}
Dec 04 15:54:28 crc kubenswrapper[4676]: I1204 15:54:28.364498 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq" event={"ID":"ed758cb2-028d-43a2-b04a-3b494673e6f6","Type":"ContainerStarted","Data":"e7bfade63a12d1cd0f9a3651c69b9cb5b19bf8324e935a81df7def2b216dfedb"}
Dec 04 15:54:28 crc kubenswrapper[4676]: I1204 15:54:28.386483 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq" podStartSLOduration=1.9109552779999999 podStartE2EDuration="2.386450621s" podCreationTimestamp="2025-12-04 15:54:26 +0000 UTC" firstStartedPulling="2025-12-04 15:54:27.30534888 +0000 UTC m=+2074.740018747" lastFinishedPulling="2025-12-04 15:54:27.780844233 +0000 UTC m=+2075.215514090" observedRunningTime="2025-12-04 15:54:28.380185001 +0000 UTC m=+2075.814854858" watchObservedRunningTime="2025-12-04 15:54:28.386450621 +0000 UTC m=+2075.821120468"
Dec 04 15:54:35 crc kubenswrapper[4676]: I1204 15:54:35.438080 4676 generic.go:334] "Generic (PLEG): container finished" podID="ed758cb2-028d-43a2-b04a-3b494673e6f6" containerID="e7bfade63a12d1cd0f9a3651c69b9cb5b19bf8324e935a81df7def2b216dfedb" exitCode=0
Dec 04 15:54:35 crc kubenswrapper[4676]: I1204 15:54:35.439553 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq" event={"ID":"ed758cb2-028d-43a2-b04a-3b494673e6f6","Type":"ContainerDied","Data":"e7bfade63a12d1cd0f9a3651c69b9cb5b19bf8324e935a81df7def2b216dfedb"}
Dec 04 15:54:36 crc kubenswrapper[4676]: I1204 15:54:36.858621 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.001871 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0\") pod \"ed758cb2-028d-43a2-b04a-3b494673e6f6\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") "
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.001964 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7m2t\" (UniqueName: \"kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t\") pod \"ed758cb2-028d-43a2-b04a-3b494673e6f6\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") "
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.002086 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam\") pod \"ed758cb2-028d-43a2-b04a-3b494673e6f6\" (UID: \"ed758cb2-028d-43a2-b04a-3b494673e6f6\") "
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.007179 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t" (OuterVolumeSpecName: "kube-api-access-m7m2t") pod "ed758cb2-028d-43a2-b04a-3b494673e6f6" (UID: "ed758cb2-028d-43a2-b04a-3b494673e6f6"). InnerVolumeSpecName "kube-api-access-m7m2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.033071 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed758cb2-028d-43a2-b04a-3b494673e6f6" (UID: "ed758cb2-028d-43a2-b04a-3b494673e6f6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.034407 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ed758cb2-028d-43a2-b04a-3b494673e6f6" (UID: "ed758cb2-028d-43a2-b04a-3b494673e6f6"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.104321 4676 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.104366 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7m2t\" (UniqueName: \"kubernetes.io/projected/ed758cb2-028d-43a2-b04a-3b494673e6f6-kube-api-access-m7m2t\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.104382 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed758cb2-028d-43a2-b04a-3b494673e6f6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.466170 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq" event={"ID":"ed758cb2-028d-43a2-b04a-3b494673e6f6","Type":"ContainerDied","Data":"3c432f7a172910580c8e0e74e0b4f0366fd5f18224a020a62099b1e03108366d"}
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.466220 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c432f7a172910580c8e0e74e0b4f0366fd5f18224a020a62099b1e03108366d"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.466290 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-kvjcq"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.523726 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"]
Dec 04 15:54:37 crc kubenswrapper[4676]: E1204 15:54:37.524209 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed758cb2-028d-43a2-b04a-3b494673e6f6" containerName="ssh-known-hosts-edpm-deployment"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.524231 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed758cb2-028d-43a2-b04a-3b494673e6f6" containerName="ssh-known-hosts-edpm-deployment"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.524437 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed758cb2-028d-43a2-b04a-3b494673e6f6" containerName="ssh-known-hosts-edpm-deployment"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.525176 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.527713 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.528164 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.528177 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.557770 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.597029 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"]
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.615781 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q69r\" (UniqueName: \"kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.615845 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.616059 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.718429 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.718569 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q69r\" (UniqueName: \"kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.718620 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.722792 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.722795 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.734623 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q69r\" (UniqueName: \"kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5bx5h\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:37 crc kubenswrapper[4676]: I1204 15:54:37.892872 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:38 crc kubenswrapper[4676]: I1204 15:54:38.417799 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"]
Dec 04 15:54:38 crc kubenswrapper[4676]: I1204 15:54:38.477476 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h" event={"ID":"47048b08-8efe-4c2b-a449-bad99291721d","Type":"ContainerStarted","Data":"bb0377fe157376dfb7e07b2440a7d18317c752e466d356519481f36b080fdf67"}
Dec 04 15:54:44 crc kubenswrapper[4676]: I1204 15:54:44.540237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h" event={"ID":"47048b08-8efe-4c2b-a449-bad99291721d","Type":"ContainerStarted","Data":"5d90f50f3814af682dd49259ecb06669063a270e8b60bf6b3ad16e6d6e73118b"}
Dec 04 15:54:45 crc kubenswrapper[4676]: I1204 15:54:45.571595 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h" podStartSLOduration=3.36621764 podStartE2EDuration="8.571325436s" podCreationTimestamp="2025-12-04 15:54:37 +0000 UTC" firstStartedPulling="2025-12-04 15:54:38.435649955 +0000 UTC m=+2085.870319802" lastFinishedPulling="2025-12-04 15:54:43.640757741 +0000 UTC m=+2091.075427598" observedRunningTime="2025-12-04 15:54:45.568642888 +0000 UTC m=+2093.003312755" watchObservedRunningTime="2025-12-04 15:54:45.571325436 +0000 UTC m=+2093.005995293"
Dec 04 15:54:53 crc kubenswrapper[4676]: I1204 15:54:53.746985 4676 generic.go:334] "Generic (PLEG): container finished" podID="47048b08-8efe-4c2b-a449-bad99291721d" containerID="5d90f50f3814af682dd49259ecb06669063a270e8b60bf6b3ad16e6d6e73118b" exitCode=0
Dec 04 15:54:53 crc kubenswrapper[4676]: I1204 15:54:53.747089 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h" event={"ID":"47048b08-8efe-4c2b-a449-bad99291721d","Type":"ContainerDied","Data":"5d90f50f3814af682dd49259ecb06669063a270e8b60bf6b3ad16e6d6e73118b"}
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.237579 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.347266 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory\") pod \"47048b08-8efe-4c2b-a449-bad99291721d\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") "
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.347462 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q69r\" (UniqueName: \"kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r\") pod \"47048b08-8efe-4c2b-a449-bad99291721d\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") "
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.347664 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key\") pod \"47048b08-8efe-4c2b-a449-bad99291721d\" (UID: \"47048b08-8efe-4c2b-a449-bad99291721d\") "
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.358138 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r" (OuterVolumeSpecName: "kube-api-access-9q69r") pod "47048b08-8efe-4c2b-a449-bad99291721d" (UID: "47048b08-8efe-4c2b-a449-bad99291721d"). InnerVolumeSpecName "kube-api-access-9q69r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.375472 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "47048b08-8efe-4c2b-a449-bad99291721d" (UID: "47048b08-8efe-4c2b-a449-bad99291721d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.375898 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory" (OuterVolumeSpecName: "inventory") pod "47048b08-8efe-4c2b-a449-bad99291721d" (UID: "47048b08-8efe-4c2b-a449-bad99291721d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.455115 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-inventory\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.455163 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q69r\" (UniqueName: \"kubernetes.io/projected/47048b08-8efe-4c2b-a449-bad99291721d-kube-api-access-9q69r\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.455174 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/47048b08-8efe-4c2b-a449-bad99291721d-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.769722 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h" event={"ID":"47048b08-8efe-4c2b-a449-bad99291721d","Type":"ContainerDied","Data":"bb0377fe157376dfb7e07b2440a7d18317c752e466d356519481f36b080fdf67"}
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.770110 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb0377fe157376dfb7e07b2440a7d18317c752e466d356519481f36b080fdf67"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.769787 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5bx5h"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.850226 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"]
Dec 04 15:54:55 crc kubenswrapper[4676]: E1204 15:54:55.850695 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47048b08-8efe-4c2b-a449-bad99291721d" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.850724 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="47048b08-8efe-4c2b-a449-bad99291721d" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.851008 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="47048b08-8efe-4c2b-a449-bad99291721d" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.851758 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.854106 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.854295 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.854375 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.854581 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.867679 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"]
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.965709 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.965868 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96rl\" (UniqueName: \"kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:55 crc kubenswrapper[4676]: I1204 15:54:55.965922 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.067497 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.068261 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96rl\" (UniqueName: \"kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.068424 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.072651 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.073921 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.088596 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96rl\" (UniqueName: \"kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.172530 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:54:56 crc kubenswrapper[4676]: I1204 15:54:56.802251 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"]
Dec 04 15:54:57 crc kubenswrapper[4676]: I1204 15:54:57.796821 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb" event={"ID":"17492632-88c9-4d92-9804-12228ba0fdad","Type":"ContainerStarted","Data":"e2fb6813b9bd257d31f066a04618d65cc0302c1a10739c91823dbb2fca8fd23d"}
Dec 04 15:54:57 crc kubenswrapper[4676]: I1204 15:54:57.797146 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb" event={"ID":"17492632-88c9-4d92-9804-12228ba0fdad","Type":"ContainerStarted","Data":"d746b1256ce2f52ec442f71e354d933f3b3ba72c37f81e3b3d97ffae0ee56ac3"}
Dec 04 15:54:57 crc kubenswrapper[4676]: I1204 15:54:57.825978 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb" podStartSLOduration=2.171242833 podStartE2EDuration="2.825955573s" podCreationTimestamp="2025-12-04 15:54:55 +0000 UTC" firstStartedPulling="2025-12-04 15:54:56.803246583 +0000 UTC m=+2104.237916450" lastFinishedPulling="2025-12-04 15:54:57.457959323 +0000 UTC m=+2104.892629190" observedRunningTime="2025-12-04 15:54:57.811432445 +0000 UTC m=+2105.246102312" watchObservedRunningTime="2025-12-04 15:54:57.825955573 +0000 UTC m=+2105.260625430"
Dec 04 15:55:08 crc kubenswrapper[4676]: I1204 15:55:08.902628 4676 generic.go:334] "Generic (PLEG): container finished" podID="17492632-88c9-4d92-9804-12228ba0fdad" containerID="e2fb6813b9bd257d31f066a04618d65cc0302c1a10739c91823dbb2fca8fd23d" exitCode=0
Dec 04 15:55:08 crc kubenswrapper[4676]: I1204 15:55:08.902748 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb" event={"ID":"17492632-88c9-4d92-9804-12228ba0fdad","Type":"ContainerDied","Data":"e2fb6813b9bd257d31f066a04618d65cc0302c1a10739c91823dbb2fca8fd23d"}
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.384757 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.423424 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key\") pod \"17492632-88c9-4d92-9804-12228ba0fdad\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") "
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.423570 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j96rl\" (UniqueName: \"kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl\") pod \"17492632-88c9-4d92-9804-12228ba0fdad\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") "
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.423648 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory\") pod \"17492632-88c9-4d92-9804-12228ba0fdad\" (UID: \"17492632-88c9-4d92-9804-12228ba0fdad\") "
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.428772 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl" (OuterVolumeSpecName: "kube-api-access-j96rl") pod "17492632-88c9-4d92-9804-12228ba0fdad" (UID: "17492632-88c9-4d92-9804-12228ba0fdad"). InnerVolumeSpecName "kube-api-access-j96rl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.454153 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "17492632-88c9-4d92-9804-12228ba0fdad" (UID: "17492632-88c9-4d92-9804-12228ba0fdad"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.455266 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory" (OuterVolumeSpecName: "inventory") pod "17492632-88c9-4d92-9804-12228ba0fdad" (UID: "17492632-88c9-4d92-9804-12228ba0fdad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.526827 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.526861 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j96rl\" (UniqueName: \"kubernetes.io/projected/17492632-88c9-4d92-9804-12228ba0fdad-kube-api-access-j96rl\") on node \"crc\" DevicePath \"\""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.526877 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/17492632-88c9-4d92-9804-12228ba0fdad-inventory\") on node \"crc\" DevicePath \"\""
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.936555 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb" event={"ID":"17492632-88c9-4d92-9804-12228ba0fdad","Type":"ContainerDied","Data":"d746b1256ce2f52ec442f71e354d933f3b3ba72c37f81e3b3d97ffae0ee56ac3"}
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.936830 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d746b1256ce2f52ec442f71e354d933f3b3ba72c37f81e3b3d97ffae0ee56ac3"
Dec 04 15:55:10 crc kubenswrapper[4676]: I1204 15:55:10.936653 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.022723 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"]
Dec 04 15:55:11 crc kubenswrapper[4676]: E1204 15:55:11.023417 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17492632-88c9-4d92-9804-12228ba0fdad" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.023449 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="17492632-88c9-4d92-9804-12228ba0fdad" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.023755 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="17492632-88c9-4d92-9804-12228ba0fdad" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.024929 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.028296 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.028681 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.028742 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.029152 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.029447 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.029730 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.029889 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.030666 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.032606 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"]
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137066 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2hbv\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137440 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137493 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137641 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137693 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137749 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.137954 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138150 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138199 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138500 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"
Dec 04
15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138742 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138806 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.138890 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240632 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240707 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240731 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240770 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240793 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240812 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240849 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2hbv\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240874 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.240974 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.241011 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.241033 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.241054 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.241100 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.245534 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.245594 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.245885 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.246111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.246265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.246789 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.247760 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.248341 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.248405 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.249472 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.249647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.250326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.250487 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.258798 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2hbv\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.342808 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:11 crc kubenswrapper[4676]: I1204 15:55:11.965584 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw"] Dec 04 15:55:12 crc kubenswrapper[4676]: I1204 15:55:12.987005 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" event={"ID":"314126ca-1837-48ba-a5b3-fa2b752ff6e6","Type":"ContainerStarted","Data":"f916ccc8b4d397d4c88e288603c52f9d001de846b1b35013b3abf23d11b54154"} Dec 04 15:55:12 crc kubenswrapper[4676]: I1204 15:55:12.987395 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" event={"ID":"314126ca-1837-48ba-a5b3-fa2b752ff6e6","Type":"ContainerStarted","Data":"2ea61f9411e286ec1eb89e1b69797d04e536699d7940096c99e8dda1e3040c3f"} Dec 04 15:55:13 crc kubenswrapper[4676]: I1204 15:55:13.023380 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" podStartSLOduration=2.6133105150000002 podStartE2EDuration="3.023358655s" podCreationTimestamp="2025-12-04 15:55:10 +0000 UTC" firstStartedPulling="2025-12-04 15:55:11.962563049 +0000 UTC m=+2119.397232906" lastFinishedPulling="2025-12-04 15:55:12.372611189 +0000 UTC m=+2119.807281046" observedRunningTime="2025-12-04 15:55:13.013576074 +0000 UTC m=+2120.448245931" watchObservedRunningTime="2025-12-04 15:55:13.023358655 +0000 UTC m=+2120.458028512" Dec 04 15:55:46 crc kubenswrapper[4676]: I1204 15:55:46.026807 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:55:46 crc kubenswrapper[4676]: I1204 15:55:46.027402 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:55:56 crc kubenswrapper[4676]: I1204 15:55:56.376437 4676 generic.go:334] "Generic (PLEG): container finished" podID="314126ca-1837-48ba-a5b3-fa2b752ff6e6" containerID="f916ccc8b4d397d4c88e288603c52f9d001de846b1b35013b3abf23d11b54154" exitCode=0 Dec 04 15:55:56 crc kubenswrapper[4676]: I1204 15:55:56.376964 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" 
event={"ID":"314126ca-1837-48ba-a5b3-fa2b752ff6e6","Type":"ContainerDied","Data":"f916ccc8b4d397d4c88e288603c52f9d001de846b1b35013b3abf23d11b54154"} Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.831058 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932371 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932515 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932539 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932573 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932597 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.932625 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933544 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933583 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933618 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933685 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933729 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933835 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2hbv\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.933867 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\" (UID: \"314126ca-1837-48ba-a5b3-fa2b752ff6e6\") " Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.939297 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.939846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.940046 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.940996 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.941405 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.941686 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.943154 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.943648 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.944038 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv" (OuterVolumeSpecName: "kube-api-access-w2hbv") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "kube-api-access-w2hbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.944711 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.945549 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.954395 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.968401 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory" (OuterVolumeSpecName: "inventory") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:57 crc kubenswrapper[4676]: I1204 15:55:57.969919 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "314126ca-1837-48ba-a5b3-fa2b752ff6e6" (UID: "314126ca-1837-48ba-a5b3-fa2b752ff6e6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037050 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037115 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037133 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037154 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037173 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037219 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2hbv\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-kube-api-access-w2hbv\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037237 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037252 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037267 4676 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037279 4676 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037291 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/314126ca-1837-48ba-a5b3-fa2b752ff6e6-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037307 4676 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037320 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.037332 4676 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/314126ca-1837-48ba-a5b3-fa2b752ff6e6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.399192 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" event={"ID":"314126ca-1837-48ba-a5b3-fa2b752ff6e6","Type":"ContainerDied","Data":"2ea61f9411e286ec1eb89e1b69797d04e536699d7940096c99e8dda1e3040c3f"} Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.399249 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ea61f9411e286ec1eb89e1b69797d04e536699d7940096c99e8dda1e3040c3f" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.399244 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.594925 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s"] Dec 04 15:55:58 crc kubenswrapper[4676]: E1204 15:55:58.595865 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="314126ca-1837-48ba-a5b3-fa2b752ff6e6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.595902 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="314126ca-1837-48ba-a5b3-fa2b752ff6e6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.609612 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="314126ca-1837-48ba-a5b3-fa2b752ff6e6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.610712 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.617521 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.617768 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.617896 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.618049 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.620191 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.640113 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s"] Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.755722 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.755863 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x78jp\" (UniqueName: \"kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.755934 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.756043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.756136 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.857923 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.858020 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.858084 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.858181 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x78jp\" (UniqueName: \"kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.858218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.859239 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.864849 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.865501 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.867922 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.876826 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x78jp\" (UniqueName: \"kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-thh6s\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:58 crc kubenswrapper[4676]: I1204 15:55:58.938627 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:55:59 crc kubenswrapper[4676]: I1204 15:55:59.493980 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s"] Dec 04 15:55:59 crc kubenswrapper[4676]: W1204 15:55:59.497773 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5dbc42d_5e2f_4114_adf7_9bbf7ef7a041.slice/crio-50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66 WatchSource:0}: Error finding container 50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66: Status 404 returned error can't find the container with id 50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66 Dec 04 15:56:00 crc kubenswrapper[4676]: I1204 15:56:00.418639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" event={"ID":"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041","Type":"ContainerStarted","Data":"a2615a72dcd9d4ccf01196ec43efa0a4c6ad4b4843628ad459a8852bc7c4c6d5"} Dec 04 15:56:00 crc kubenswrapper[4676]: I1204 15:56:00.418936 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" event={"ID":"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041","Type":"ContainerStarted","Data":"50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66"} Dec 04 15:56:00 crc kubenswrapper[4676]: I1204 15:56:00.446412 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" podStartSLOduration=1.905653938 podStartE2EDuration="2.446371988s" podCreationTimestamp="2025-12-04 15:55:58 +0000 UTC" firstStartedPulling="2025-12-04 15:55:59.500794048 +0000 UTC m=+2166.935463905" lastFinishedPulling="2025-12-04 15:56:00.041512098 +0000 UTC m=+2167.476181955" observedRunningTime="2025-12-04 15:56:00.432931901 +0000 UTC m=+2167.867601758" watchObservedRunningTime="2025-12-04 15:56:00.446371988 +0000 UTC m=+2167.881041845" Dec 04 15:56:16 crc kubenswrapper[4676]: I1204 15:56:16.026806 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:56:16 crc kubenswrapper[4676]: I1204 15:56:16.027265 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.352490 4676 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"] Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.355191 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.391579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"] Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.462403 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.462868 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.463040 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt6rf\" (UniqueName: \"kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.565276 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.565357 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt6rf\" (UniqueName: \"kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.565453 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.565952 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.565952 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content\") pod \"certified-operators-w2l8g\" 
(UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.590853 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt6rf\" (UniqueName: \"kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf\") pod \"certified-operators-w2l8g\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") " pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:42 crc kubenswrapper[4676]: I1204 15:56:42.683365 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:43 crc kubenswrapper[4676]: I1204 15:56:43.326035 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"] Dec 04 15:56:43 crc kubenswrapper[4676]: I1204 15:56:43.923254 4676 generic.go:334] "Generic (PLEG): container finished" podID="478d1294-182e-42cf-bd61-b4c92e849900" containerID="a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73" exitCode=0 Dec 04 15:56:43 crc kubenswrapper[4676]: I1204 15:56:43.923293 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerDied","Data":"a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73"} Dec 04 15:56:43 crc kubenswrapper[4676]: I1204 15:56:43.923538 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerStarted","Data":"e1ae3bd0c3f6af4d0779bb829419fdc208c81177d02904b569d1aa8bccc60e7a"} Dec 04 15:56:43 crc kubenswrapper[4676]: I1204 15:56:43.928813 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 15:56:45 crc kubenswrapper[4676]: I1204 15:56:45.951335 4676 generic.go:334] "Generic (PLEG): container finished" podID="478d1294-182e-42cf-bd61-b4c92e849900" containerID="802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585" exitCode=0 Dec 04 15:56:45 crc kubenswrapper[4676]: I1204 15:56:45.951478 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerDied","Data":"802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585"} Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.027415 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.027488 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.027540 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.028385 4676 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.028475 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" gracePeriod=600
Dec 04 15:56:46 crc kubenswrapper[4676]: E1204 15:56:46.153349 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.966177 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" exitCode=0
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.966263 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640"}
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.966599 4676 scope.go:117] "RemoveContainer" containerID="1bb4cd7ae05676babbbdcc2cd3ff8f1dd10eab8b768507ef7fd8ae94ee7c2991"
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.967598 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640"
Dec 04 15:56:46 crc kubenswrapper[4676]: E1204 15:56:46.968032 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 15:56:46 crc kubenswrapper[4676]: I1204 15:56:46.970339 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerStarted","Data":"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753"}
Dec 04 15:56:47 crc kubenswrapper[4676]: I1204 15:56:47.014964 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w2l8g" podStartSLOduration=2.609843517 podStartE2EDuration="5.014922982s" podCreationTimestamp="2025-12-04 15:56:42 +0000 UTC" firstStartedPulling="2025-12-04 15:56:43.928549461 +0000 UTC m=+2211.363219318" lastFinishedPulling="2025-12-04 15:56:46.333628926 +0000 UTC m=+2213.768298783" observedRunningTime="2025-12-04 15:56:47.012370009 +0000 UTC m=+2214.447039866" watchObservedRunningTime="2025-12-04 15:56:47.014922982 +0000 UTC m=+2214.449592839"
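
Two details in the entries above are worth unpacking. First, the startup-latency line for certified-operators-w2l8g is internally consistent: podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp = 15:56:47.015 - 15:56:42 ≈ 5.015s, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling - firstStartedPulling = 15:56:46.334 - 15:56:43.929 ≈ 2.405s), leaving ≈ 2.610s. Second, the repeated "back-off 5m0s" errors mean machine-config-daemon has already hit kubelet's restart back-off ceiling: the delay starts at 10s, doubles on each failed restart, is capped at five minutes, and resets only after the container has run cleanly for ten minutes. The Go sketch below models that growth; it is a simplified reimplementation for illustration, not kubelet's actual code.

```go
// Sketch (not kubelet's source): how CrashLoopBackOff restart delays grow.
// Kubelet's documented back-off starts at 10s, doubles per failed restart,
// and is capped at 5m, matching the "back-off 5m0s" entries in this log.
package main

import (
	"fmt"
	"time"
)

func backoff(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return d
}

func main() {
	for r := 0; r <= 6; r++ {
		fmt.Printf("restart %d -> wait %v\n", r, backoff(r)) // 10s, 20s, ... 5m0s
	}
}
```
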
Dec 04 15:56:52 crc kubenswrapper[4676]: I1204 15:56:52.684052 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w2l8g"
Dec 04 15:56:52 crc kubenswrapper[4676]: I1204 15:56:52.684659 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w2l8g"
Dec 04 15:56:52 crc kubenswrapper[4676]: I1204 15:56:52.775115 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w2l8g"
Dec 04 15:56:53 crc kubenswrapper[4676]: I1204 15:56:53.082312 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w2l8g"
Dec 04 15:56:53 crc kubenswrapper[4676]: I1204 15:56:53.149288 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"]
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.051436 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w2l8g" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="registry-server" containerID="cri-o://49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753" gracePeriod=2
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.535414 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2l8g"
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.649596 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities\") pod \"478d1294-182e-42cf-bd61-b4c92e849900\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") "
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.649792 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content\") pod \"478d1294-182e-42cf-bd61-b4c92e849900\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") "
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.649834 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt6rf\" (UniqueName: \"kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf\") pod \"478d1294-182e-42cf-bd61-b4c92e849900\" (UID: \"478d1294-182e-42cf-bd61-b4c92e849900\") "
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.651569 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities" (OuterVolumeSpecName: "utilities") pod "478d1294-182e-42cf-bd61-b4c92e849900" (UID: "478d1294-182e-42cf-bd61-b4c92e849900"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.656181 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf" (OuterVolumeSpecName: "kube-api-access-rt6rf") pod "478d1294-182e-42cf-bd61-b4c92e849900" (UID: "478d1294-182e-42cf-bd61-b4c92e849900"). InnerVolumeSpecName "kube-api-access-rt6rf".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.714899 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "478d1294-182e-42cf-bd61-b4c92e849900" (UID: "478d1294-182e-42cf-bd61-b4c92e849900"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.752081 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.752119 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt6rf\" (UniqueName: \"kubernetes.io/projected/478d1294-182e-42cf-bd61-b4c92e849900-kube-api-access-rt6rf\") on node \"crc\" DevicePath \"\"" Dec 04 15:56:55 crc kubenswrapper[4676]: I1204 15:56:55.752134 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/478d1294-182e-42cf-bd61-b4c92e849900-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.063037 4676 generic.go:334] "Generic (PLEG): container finished" podID="478d1294-182e-42cf-bd61-b4c92e849900" containerID="49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753" exitCode=0 Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.063090 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerDied","Data":"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753"} Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.063102 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2l8g" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.063120 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2l8g" event={"ID":"478d1294-182e-42cf-bd61-b4c92e849900","Type":"ContainerDied","Data":"e1ae3bd0c3f6af4d0779bb829419fdc208c81177d02904b569d1aa8bccc60e7a"} Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.063139 4676 scope.go:117] "RemoveContainer" containerID="49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.107878 4676 scope.go:117] "RemoveContainer" containerID="802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.113986 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"] Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.124076 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w2l8g"] Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.134425 4676 scope.go:117] "RemoveContainer" containerID="a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.176249 4676 scope.go:117] "RemoveContainer" containerID="49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753" Dec 04 15:56:56 crc kubenswrapper[4676]: E1204 15:56:56.176798 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753\": container with ID starting with 49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753 not found: ID does not exist" containerID="49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.176944 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753"} err="failed to get container status \"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753\": rpc error: code = NotFound desc = could not find container \"49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753\": container with ID starting with 49b7b372c5b5aca9076730030d8900368c267cac8a382b1962e15b7cca6f9753 not found: ID does not exist" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.177087 4676 scope.go:117] "RemoveContainer" containerID="802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585" Dec 04 15:56:56 crc kubenswrapper[4676]: E1204 15:56:56.177488 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585\": container with ID starting with 802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585 not found: ID does not exist" containerID="802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.177609 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585"} err="failed to get container status \"802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585\": rpc error: code = NotFound desc = could not find 
container \"802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585\": container with ID starting with 802d6a6a86dd6eb962987d7049f7bcc9634823b8157722c635f7d9fa650db585 not found: ID does not exist" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.177725 4676 scope.go:117] "RemoveContainer" containerID="a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73" Dec 04 15:56:56 crc kubenswrapper[4676]: E1204 15:56:56.178036 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73\": container with ID starting with a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73 not found: ID does not exist" containerID="a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73" Dec 04 15:56:56 crc kubenswrapper[4676]: I1204 15:56:56.178152 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73"} err="failed to get container status \"a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73\": rpc error: code = NotFound desc = could not find container \"a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73\": container with ID starting with a1c5f4ce3e00a7587b9c1b5f51537422b82240f3736d7f35424194e40577db73 not found: ID does not exist" Dec 04 15:56:57 crc kubenswrapper[4676]: I1204 15:56:57.396720 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478d1294-182e-42cf-bd61-b4c92e849900" path="/var/lib/kubelet/pods/478d1294-182e-42cf-bd61-b4c92e849900/volumes" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.427893 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:56:58 crc kubenswrapper[4676]: E1204 15:56:58.428509 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="registry-server" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.428540 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="registry-server" Dec 04 15:56:58 crc kubenswrapper[4676]: E1204 15:56:58.428563 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="extract-utilities" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.428571 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="extract-utilities" Dec 04 15:56:58 crc kubenswrapper[4676]: E1204 15:56:58.428586 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="extract-content" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.428594 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="extract-content" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.428899 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="478d1294-182e-42cf-bd61-b4c92e849900" containerName="registry-server" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.430836 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.438304 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.607328 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.607387 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.607546 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fsj2\" (UniqueName: \"kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.710359 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fsj2\" (UniqueName: \"kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.710655 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.710691 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.711293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.711332 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.732892 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2fsj2\" (UniqueName: \"kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2\") pod \"community-operators-tbrkc\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:58 crc kubenswrapper[4676]: I1204 15:56:58.769012 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:56:59 crc kubenswrapper[4676]: I1204 15:56:59.320362 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:56:59 crc kubenswrapper[4676]: W1204 15:56:59.337151 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e023de_75e9_43bf_a3ed_f4824fbd3524.slice/crio-4bc48c08f7cc6572f6cc97ad34a1bcffb91f4276e16b58e0b9b978995e45b2d7 WatchSource:0}: Error finding container 4bc48c08f7cc6572f6cc97ad34a1bcffb91f4276e16b58e0b9b978995e45b2d7: Status 404 returned error can't find the container with id 4bc48c08f7cc6572f6cc97ad34a1bcffb91f4276e16b58e0b9b978995e45b2d7 Dec 04 15:56:59 crc kubenswrapper[4676]: I1204 15:56:59.385284 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:56:59 crc kubenswrapper[4676]: E1204 15:56:59.385609 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:57:00 crc kubenswrapper[4676]: I1204 15:57:00.104519 4676 generic.go:334] "Generic (PLEG): container finished" podID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerID="1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262" exitCode=0 Dec 04 15:57:00 crc kubenswrapper[4676]: I1204 15:57:00.104628 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerDied","Data":"1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262"} Dec 04 15:57:00 crc kubenswrapper[4676]: I1204 15:57:00.104920 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerStarted","Data":"4bc48c08f7cc6572f6cc97ad34a1bcffb91f4276e16b58e0b9b978995e45b2d7"} Dec 04 15:57:03 crc kubenswrapper[4676]: I1204 15:57:03.139067 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerStarted","Data":"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09"} Dec 04 15:57:04 crc kubenswrapper[4676]: I1204 15:57:04.155769 4676 generic.go:334] "Generic (PLEG): container finished" podID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerID="d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09" exitCode=0 Dec 04 15:57:04 crc kubenswrapper[4676]: I1204 15:57:04.155959 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" 
event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerDied","Data":"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09"} Dec 04 15:57:07 crc kubenswrapper[4676]: I1204 15:57:07.195040 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerStarted","Data":"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7"} Dec 04 15:57:07 crc kubenswrapper[4676]: I1204 15:57:07.225128 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tbrkc" podStartSLOduration=2.510697954 podStartE2EDuration="9.225110144s" podCreationTimestamp="2025-12-04 15:56:58 +0000 UTC" firstStartedPulling="2025-12-04 15:57:00.106751769 +0000 UTC m=+2227.541421626" lastFinishedPulling="2025-12-04 15:57:06.821163969 +0000 UTC m=+2234.255833816" observedRunningTime="2025-12-04 15:57:07.215456336 +0000 UTC m=+2234.650126193" watchObservedRunningTime="2025-12-04 15:57:07.225110144 +0000 UTC m=+2234.659780001" Dec 04 15:57:08 crc kubenswrapper[4676]: I1204 15:57:08.772203 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:08 crc kubenswrapper[4676]: I1204 15:57:08.772555 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:08 crc kubenswrapper[4676]: I1204 15:57:08.821080 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:11 crc kubenswrapper[4676]: I1204 15:57:11.232138 4676 generic.go:334] "Generic (PLEG): container finished" podID="a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" containerID="a2615a72dcd9d4ccf01196ec43efa0a4c6ad4b4843628ad459a8852bc7c4c6d5" exitCode=0 Dec 04 15:57:11 crc kubenswrapper[4676]: I1204 15:57:11.232221 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" event={"ID":"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041","Type":"ContainerDied","Data":"a2615a72dcd9d4ccf01196ec43efa0a4c6ad4b4843628ad459a8852bc7c4c6d5"} Dec 04 15:57:11 crc kubenswrapper[4676]: I1204 15:57:11.385670 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:57:11 crc kubenswrapper[4676]: E1204 15:57:11.386103 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.675205 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.803274 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key\") pod \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.803623 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle\") pod \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.803918 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x78jp\" (UniqueName: \"kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp\") pod \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.804031 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0\") pod \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.804061 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory\") pod \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\" (UID: \"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041\") " Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.809341 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" (UID: "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.811332 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp" (OuterVolumeSpecName: "kube-api-access-x78jp") pod "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" (UID: "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041"). InnerVolumeSpecName "kube-api-access-x78jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.833772 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" (UID: "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.834023 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" (UID: "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.846260 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory" (OuterVolumeSpecName: "inventory") pod "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" (UID: "a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.906041 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x78jp\" (UniqueName: \"kubernetes.io/projected/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-kube-api-access-x78jp\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.906075 4676 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.906085 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.906095 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:12 crc kubenswrapper[4676]: I1204 15:57:12.906104 4676 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.256224 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" event={"ID":"a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041","Type":"ContainerDied","Data":"50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66"} Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.256284 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50bd8393519ea2078a578eab64f5c3e9e005c482cedcaae176cf5561a2c1ba66" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.256391 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-thh6s" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.356021 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct"] Dec 04 15:57:13 crc kubenswrapper[4676]: E1204 15:57:13.356474 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.356493 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.356712 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.357486 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.359525 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.359724 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.359952 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.360127 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.362390 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.363377 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.371607 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct"] Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.417599 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js5bl\" (UniqueName: \"kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.417743 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.417933 4676 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.418189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.418319 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.418393 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.520772 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.521245 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.521414 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.521643 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js5bl\" (UniqueName: \"kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.521788 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.522060 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.525723 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.525831 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.526646 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.526964 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.528044 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.540225 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js5bl\" (UniqueName: 
\"kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:13 crc kubenswrapper[4676]: I1204 15:57:13.683600 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:57:14 crc kubenswrapper[4676]: I1204 15:57:14.243744 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct"] Dec 04 15:57:14 crc kubenswrapper[4676]: I1204 15:57:14.271949 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" event={"ID":"9ecf8093-2284-4bcf-adb4-c2880f87b7e9","Type":"ContainerStarted","Data":"ada4423a12293b0172c1aa0375f6429346c03e9c6bc45a015ab1ad364c6a2f6a"} Dec 04 15:57:15 crc kubenswrapper[4676]: I1204 15:57:15.284329 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" event={"ID":"9ecf8093-2284-4bcf-adb4-c2880f87b7e9","Type":"ContainerStarted","Data":"5dec3841524259031c3f5569f5aa0edfd655d9c31380056988f3ecd2fb09b78f"} Dec 04 15:57:15 crc kubenswrapper[4676]: I1204 15:57:15.306354 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" podStartSLOduration=1.846776283 podStartE2EDuration="2.306338589s" podCreationTimestamp="2025-12-04 15:57:13 +0000 UTC" firstStartedPulling="2025-12-04 15:57:14.246844348 +0000 UTC m=+2241.681514205" lastFinishedPulling="2025-12-04 15:57:14.706406664 +0000 UTC m=+2242.141076511" observedRunningTime="2025-12-04 15:57:15.303154858 +0000 UTC m=+2242.737824725" watchObservedRunningTime="2025-12-04 15:57:15.306338589 +0000 UTC m=+2242.741008446" Dec 04 15:57:18 crc kubenswrapper[4676]: I1204 15:57:18.823977 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:18 crc kubenswrapper[4676]: I1204 15:57:18.876626 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.334669 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tbrkc" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="registry-server" containerID="cri-o://7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7" gracePeriod=2 Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.789308 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.866485 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fsj2\" (UniqueName: \"kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2\") pod \"20e023de-75e9-43bf-a3ed-f4824fbd3524\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.866620 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities\") pod \"20e023de-75e9-43bf-a3ed-f4824fbd3524\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.866704 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content\") pod \"20e023de-75e9-43bf-a3ed-f4824fbd3524\" (UID: \"20e023de-75e9-43bf-a3ed-f4824fbd3524\") " Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.867889 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities" (OuterVolumeSpecName: "utilities") pod "20e023de-75e9-43bf-a3ed-f4824fbd3524" (UID: "20e023de-75e9-43bf-a3ed-f4824fbd3524"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.872979 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2" (OuterVolumeSpecName: "kube-api-access-2fsj2") pod "20e023de-75e9-43bf-a3ed-f4824fbd3524" (UID: "20e023de-75e9-43bf-a3ed-f4824fbd3524"). InnerVolumeSpecName "kube-api-access-2fsj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.914685 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20e023de-75e9-43bf-a3ed-f4824fbd3524" (UID: "20e023de-75e9-43bf-a3ed-f4824fbd3524"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.970708 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fsj2\" (UniqueName: \"kubernetes.io/projected/20e023de-75e9-43bf-a3ed-f4824fbd3524-kube-api-access-2fsj2\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.971108 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:19 crc kubenswrapper[4676]: I1204 15:57:19.971122 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20e023de-75e9-43bf-a3ed-f4824fbd3524-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.347541 4676 generic.go:334] "Generic (PLEG): container finished" podID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerID="7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7" exitCode=0 Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.347611 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerDied","Data":"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7"} Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.347628 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tbrkc" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.347659 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tbrkc" event={"ID":"20e023de-75e9-43bf-a3ed-f4824fbd3524","Type":"ContainerDied","Data":"4bc48c08f7cc6572f6cc97ad34a1bcffb91f4276e16b58e0b9b978995e45b2d7"} Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.347682 4676 scope.go:117] "RemoveContainer" containerID="7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.371134 4676 scope.go:117] "RemoveContainer" containerID="d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.394181 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.401031 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tbrkc"] Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.418708 4676 scope.go:117] "RemoveContainer" containerID="1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.465629 4676 scope.go:117] "RemoveContainer" containerID="7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7" Dec 04 15:57:20 crc kubenswrapper[4676]: E1204 15:57:20.466261 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7\": container with ID starting with 7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7 not found: ID does not exist" containerID="7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.466295 
4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7"} err="failed to get container status \"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7\": rpc error: code = NotFound desc = could not find container \"7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7\": container with ID starting with 7530d5ff80323e4399cedf30ce12dd6b8a971a2f1f9dd387358f385d5d6015d7 not found: ID does not exist" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.466323 4676 scope.go:117] "RemoveContainer" containerID="d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09" Dec 04 15:57:20 crc kubenswrapper[4676]: E1204 15:57:20.466779 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09\": container with ID starting with d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09 not found: ID does not exist" containerID="d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.466803 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09"} err="failed to get container status \"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09\": rpc error: code = NotFound desc = could not find container \"d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09\": container with ID starting with d283761ca0209da4d22bcd56a3da97a7988f0b7f9f672e07fa78996c0ffd9a09 not found: ID does not exist" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.466823 4676 scope.go:117] "RemoveContainer" containerID="1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262" Dec 04 15:57:20 crc kubenswrapper[4676]: E1204 15:57:20.467314 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262\": container with ID starting with 1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262 not found: ID does not exist" containerID="1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262" Dec 04 15:57:20 crc kubenswrapper[4676]: I1204 15:57:20.467363 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262"} err="failed to get container status \"1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262\": rpc error: code = NotFound desc = could not find container \"1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262\": container with ID starting with 1bd72fbb988ac8fe3253ead6e1ebec2c2e83c30843a9cfbf901a63e0382a2262 not found: ID does not exist" Dec 04 15:57:21 crc kubenswrapper[4676]: I1204 15:57:21.398435 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" path="/var/lib/kubelet/pods/20e023de-75e9-43bf-a3ed-f4824fbd3524/volumes" Dec 04 15:57:24 crc kubenswrapper[4676]: I1204 15:57:24.385870 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:57:24 crc kubenswrapper[4676]: E1204 15:57:24.386304 4676 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:57:38 crc kubenswrapper[4676]: I1204 15:57:38.384482 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:57:38 crc kubenswrapper[4676]: E1204 15:57:38.385439 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:57:52 crc kubenswrapper[4676]: I1204 15:57:52.385235 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:57:52 crc kubenswrapper[4676]: E1204 15:57:52.386063 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:58:05 crc kubenswrapper[4676]: I1204 15:58:05.384695 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:58:05 crc kubenswrapper[4676]: E1204 15:58:05.385551 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:58:09 crc kubenswrapper[4676]: I1204 15:58:09.091663 4676 generic.go:334] "Generic (PLEG): container finished" podID="9ecf8093-2284-4bcf-adb4-c2880f87b7e9" containerID="5dec3841524259031c3f5569f5aa0edfd655d9c31380056988f3ecd2fb09b78f" exitCode=0 Dec 04 15:58:09 crc kubenswrapper[4676]: I1204 15:58:09.091781 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" event={"ID":"9ecf8093-2284-4bcf-adb4-c2880f87b7e9","Type":"ContainerDied","Data":"5dec3841524259031c3f5569f5aa0edfd655d9c31380056988f3ecd2fb09b78f"} Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.532664 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709002 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709060 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js5bl\" (UniqueName: \"kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709135 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709206 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709268 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.709316 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key\") pod \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\" (UID: \"9ecf8093-2284-4bcf-adb4-c2880f87b7e9\") " Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.716076 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.716332 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl" (OuterVolumeSpecName: "kube-api-access-js5bl") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "kube-api-access-js5bl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.738933 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.740343 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.742168 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.745030 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory" (OuterVolumeSpecName: "inventory") pod "9ecf8093-2284-4bcf-adb4-c2880f87b7e9" (UID: "9ecf8093-2284-4bcf-adb4-c2880f87b7e9"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812028 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812066 4676 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812079 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js5bl\" (UniqueName: \"kubernetes.io/projected/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-kube-api-access-js5bl\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812093 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812104 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:10 crc kubenswrapper[4676]: I1204 15:58:10.812116 4676 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ecf8093-2284-4bcf-adb4-c2880f87b7e9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.119298 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" event={"ID":"9ecf8093-2284-4bcf-adb4-c2880f87b7e9","Type":"ContainerDied","Data":"ada4423a12293b0172c1aa0375f6429346c03e9c6bc45a015ab1ad364c6a2f6a"} Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.119353 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ada4423a12293b0172c1aa0375f6429346c03e9c6bc45a015ab1ad364c6a2f6a" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.119372 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.212564 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp"] Dec 04 15:58:11 crc kubenswrapper[4676]: E1204 15:58:11.213301 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="extract-content" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213342 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="extract-content" Dec 04 15:58:11 crc kubenswrapper[4676]: E1204 15:58:11.213393 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ecf8093-2284-4bcf-adb4-c2880f87b7e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213405 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ecf8093-2284-4bcf-adb4-c2880f87b7e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 15:58:11 crc kubenswrapper[4676]: E1204 15:58:11.213419 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="extract-utilities" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213428 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="extract-utilities" Dec 04 15:58:11 crc kubenswrapper[4676]: E1204 15:58:11.213443 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="registry-server" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213452 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="registry-server" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213737 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="20e023de-75e9-43bf-a3ed-f4824fbd3524" containerName="registry-server" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.213760 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ecf8093-2284-4bcf-adb4-c2880f87b7e9" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.214695 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.220850 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.221395 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.221517 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.221644 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.222029 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.226761 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp"] Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.321766 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.321875 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8dfb\" (UniqueName: \"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.322010 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.322270 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.322473 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.424827 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.424896 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.424969 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8dfb\" (UniqueName: \"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.425008 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.425081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.429791 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.429895 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.430362 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.436673 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.440970 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8dfb\" (UniqueName: \"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:11 crc kubenswrapper[4676]: I1204 15:58:11.549510 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 15:58:12 crc kubenswrapper[4676]: I1204 15:58:12.098050 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp"] Dec 04 15:58:12 crc kubenswrapper[4676]: I1204 15:58:12.132673 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" event={"ID":"9724a435-38f2-4384-b3fe-d5229301866d","Type":"ContainerStarted","Data":"61341f7f038293e0a95ec3f006d33bd007f521a56add547dd1608edff68fdea3"} Dec 04 15:58:14 crc kubenswrapper[4676]: I1204 15:58:14.158541 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" event={"ID":"9724a435-38f2-4384-b3fe-d5229301866d","Type":"ContainerStarted","Data":"095e5c6fb84430b8c643bdae80572db1358e3a48980d7352482b3930c19c18d5"} Dec 04 15:58:14 crc kubenswrapper[4676]: I1204 15:58:14.187520 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" podStartSLOduration=1.7054122440000001 podStartE2EDuration="3.187481865s" podCreationTimestamp="2025-12-04 15:58:11 +0000 UTC" firstStartedPulling="2025-12-04 15:58:12.104792939 +0000 UTC m=+2299.539462796" lastFinishedPulling="2025-12-04 15:58:13.58686255 +0000 UTC m=+2301.021532417" observedRunningTime="2025-12-04 15:58:14.176844759 +0000 UTC m=+2301.611514616" watchObservedRunningTime="2025-12-04 15:58:14.187481865 +0000 UTC m=+2301.622151722" Dec 04 15:58:18 crc kubenswrapper[4676]: I1204 15:58:18.385550 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:58:18 crc kubenswrapper[4676]: E1204 15:58:18.386565 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.526387 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.529674 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.543531 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.611416 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks8g5\" (UniqueName: \"kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.611510 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.611594 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.713541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks8g5\" (UniqueName: \"kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.713628 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.713733 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.714421 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.715133 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.738391 4676 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ks8g5\" (UniqueName: \"kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5\") pod \"redhat-marketplace-g74zj\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:24 crc kubenswrapper[4676]: I1204 15:58:24.851753 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:25 crc kubenswrapper[4676]: I1204 15:58:25.332626 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:26 crc kubenswrapper[4676]: I1204 15:58:26.043275 4676 generic.go:334] "Generic (PLEG): container finished" podID="2c043d90-4867-473e-95fb-7f42a547ee07" containerID="978e343e4c261be4570b2f3a4da566301e57718210b614ecc9d38107ecef3980" exitCode=0 Dec 04 15:58:26 crc kubenswrapper[4676]: I1204 15:58:26.043355 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerDied","Data":"978e343e4c261be4570b2f3a4da566301e57718210b614ecc9d38107ecef3980"} Dec 04 15:58:26 crc kubenswrapper[4676]: I1204 15:58:26.043424 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerStarted","Data":"25675c27940813ba0853bbc670563dd3637778e6a86368a2cce2ccc246dd5a3f"} Dec 04 15:58:27 crc kubenswrapper[4676]: I1204 15:58:27.055721 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerStarted","Data":"a288cca687dcc0455a219129b1685a6ad614f4442be8829b173770ae5263d3c0"} Dec 04 15:58:28 crc kubenswrapper[4676]: I1204 15:58:28.070376 4676 generic.go:334] "Generic (PLEG): container finished" podID="2c043d90-4867-473e-95fb-7f42a547ee07" containerID="a288cca687dcc0455a219129b1685a6ad614f4442be8829b173770ae5263d3c0" exitCode=0 Dec 04 15:58:28 crc kubenswrapper[4676]: I1204 15:58:28.070438 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerDied","Data":"a288cca687dcc0455a219129b1685a6ad614f4442be8829b173770ae5263d3c0"} Dec 04 15:58:29 crc kubenswrapper[4676]: I1204 15:58:29.087827 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerStarted","Data":"127dc79386d4ec962161a76e637e955814689e31c922f2db9b011b7aee9f187f"} Dec 04 15:58:29 crc kubenswrapper[4676]: I1204 15:58:29.127269 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g74zj" podStartSLOduration=2.331023396 podStartE2EDuration="5.127248807s" podCreationTimestamp="2025-12-04 15:58:24 +0000 UTC" firstStartedPulling="2025-12-04 15:58:26.046502458 +0000 UTC m=+2313.481172315" lastFinishedPulling="2025-12-04 15:58:28.842727869 +0000 UTC m=+2316.277397726" observedRunningTime="2025-12-04 15:58:29.118495255 +0000 UTC m=+2316.553165132" watchObservedRunningTime="2025-12-04 15:58:29.127248807 +0000 UTC m=+2316.561918664" Dec 04 15:58:32 crc kubenswrapper[4676]: I1204 15:58:32.384595 4676 scope.go:117] "RemoveContainer" 
containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:58:32 crc kubenswrapper[4676]: E1204 15:58:32.385302 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:58:34 crc kubenswrapper[4676]: I1204 15:58:34.853337 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:34 crc kubenswrapper[4676]: I1204 15:58:34.853715 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:34 crc kubenswrapper[4676]: I1204 15:58:34.920025 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:35 crc kubenswrapper[4676]: I1204 15:58:35.196889 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:35 crc kubenswrapper[4676]: I1204 15:58:35.259501 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:37 crc kubenswrapper[4676]: I1204 15:58:37.168153 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g74zj" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="registry-server" containerID="cri-o://127dc79386d4ec962161a76e637e955814689e31c922f2db9b011b7aee9f187f" gracePeriod=2 Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.192947 4676 generic.go:334] "Generic (PLEG): container finished" podID="2c043d90-4867-473e-95fb-7f42a547ee07" containerID="127dc79386d4ec962161a76e637e955814689e31c922f2db9b011b7aee9f187f" exitCode=0 Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.193041 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerDied","Data":"127dc79386d4ec962161a76e637e955814689e31c922f2db9b011b7aee9f187f"} Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.777281 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.914503 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks8g5\" (UniqueName: \"kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5\") pod \"2c043d90-4867-473e-95fb-7f42a547ee07\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.914602 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities\") pod \"2c043d90-4867-473e-95fb-7f42a547ee07\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.914690 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content\") pod \"2c043d90-4867-473e-95fb-7f42a547ee07\" (UID: \"2c043d90-4867-473e-95fb-7f42a547ee07\") " Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.915651 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities" (OuterVolumeSpecName: "utilities") pod "2c043d90-4867-473e-95fb-7f42a547ee07" (UID: "2c043d90-4867-473e-95fb-7f42a547ee07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.916304 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.920727 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5" (OuterVolumeSpecName: "kube-api-access-ks8g5") pod "2c043d90-4867-473e-95fb-7f42a547ee07" (UID: "2c043d90-4867-473e-95fb-7f42a547ee07"). InnerVolumeSpecName "kube-api-access-ks8g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 15:58:39 crc kubenswrapper[4676]: I1204 15:58:39.934789 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c043d90-4867-473e-95fb-7f42a547ee07" (UID: "2c043d90-4867-473e-95fb-7f42a547ee07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.018046 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ks8g5\" (UniqueName: \"kubernetes.io/projected/2c043d90-4867-473e-95fb-7f42a547ee07-kube-api-access-ks8g5\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.018388 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c043d90-4867-473e-95fb-7f42a547ee07-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.206668 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g74zj" event={"ID":"2c043d90-4867-473e-95fb-7f42a547ee07","Type":"ContainerDied","Data":"25675c27940813ba0853bbc670563dd3637778e6a86368a2cce2ccc246dd5a3f"} Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.206729 4676 scope.go:117] "RemoveContainer" containerID="127dc79386d4ec962161a76e637e955814689e31c922f2db9b011b7aee9f187f" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.206728 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g74zj" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.242254 4676 scope.go:117] "RemoveContainer" containerID="a288cca687dcc0455a219129b1685a6ad614f4442be8829b173770ae5263d3c0" Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.246375 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.256225 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g74zj"] Dec 04 15:58:40 crc kubenswrapper[4676]: I1204 15:58:40.268963 4676 scope.go:117] "RemoveContainer" containerID="978e343e4c261be4570b2f3a4da566301e57718210b614ecc9d38107ecef3980" Dec 04 15:58:41 crc kubenswrapper[4676]: I1204 15:58:41.397736 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" path="/var/lib/kubelet/pods/2c043d90-4867-473e-95fb-7f42a547ee07/volumes" Dec 04 15:58:44 crc kubenswrapper[4676]: I1204 15:58:44.385295 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:58:44 crc kubenswrapper[4676]: E1204 15:58:44.386090 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:58:55 crc kubenswrapper[4676]: I1204 15:58:55.384699 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:58:55 crc kubenswrapper[4676]: E1204 15:58:55.385845 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" 
podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:59:06 crc kubenswrapper[4676]: I1204 15:59:06.385268 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:59:06 crc kubenswrapper[4676]: E1204 15:59:06.386099 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:59:17 crc kubenswrapper[4676]: I1204 15:59:17.384895 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:59:17 crc kubenswrapper[4676]: E1204 15:59:17.385705 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:59:30 crc kubenswrapper[4676]: I1204 15:59:30.385003 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:59:30 crc kubenswrapper[4676]: E1204 15:59:30.385790 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:59:43 crc kubenswrapper[4676]: I1204 15:59:43.525723 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:59:43 crc kubenswrapper[4676]: E1204 15:59:43.526605 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 15:59:58 crc kubenswrapper[4676]: I1204 15:59:58.384914 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 15:59:58 crc kubenswrapper[4676]: E1204 15:59:58.385782 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.212540 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h"] Dec 04 16:00:00 crc kubenswrapper[4676]: E1204 16:00:00.215243 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="extract-utilities" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.215300 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="extract-utilities" Dec 04 16:00:00 crc kubenswrapper[4676]: E1204 16:00:00.215352 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="registry-server" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.215363 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="registry-server" Dec 04 16:00:00 crc kubenswrapper[4676]: E1204 16:00:00.215380 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="extract-content" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.215392 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="extract-content" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.215758 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c043d90-4867-473e-95fb-7f42a547ee07" containerName="registry-server" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.216896 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.222451 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.222840 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.243662 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h"] Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.258744 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8s2n\" (UniqueName: \"kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.258829 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.259228 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.361836 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.361982 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8s2n\" (UniqueName: \"kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.362025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.363005 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.371721 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.379590 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8s2n\" (UniqueName: \"kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n\") pod \"collect-profiles-29414400-5pk7h\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:00 crc kubenswrapper[4676]: I1204 16:00:00.555261 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:01 crc kubenswrapper[4676]: I1204 16:00:01.064981 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h"] Dec 04 16:00:01 crc kubenswrapper[4676]: I1204 16:00:01.181420 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" event={"ID":"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af","Type":"ContainerStarted","Data":"c5ca6acd865e6f00df82fbbde964bab7308365a4790cad82c85ce33007987e11"} Dec 04 16:00:02 crc kubenswrapper[4676]: I1204 16:00:02.192319 4676 generic.go:334] "Generic (PLEG): container finished" podID="03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" containerID="9dda80fe98e231dfc7a509531a2d66e41ec11f3b9e9d18959523984a81530edf" exitCode=0 Dec 04 16:00:02 crc kubenswrapper[4676]: I1204 16:00:02.192972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" event={"ID":"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af","Type":"ContainerDied","Data":"9dda80fe98e231dfc7a509531a2d66e41ec11f3b9e9d18959523984a81530edf"} Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.610138 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.702191 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume\") pod \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.702266 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8s2n\" (UniqueName: \"kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n\") pod \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.702576 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume\") pod \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\" (UID: \"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af\") " Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.703044 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume" (OuterVolumeSpecName: "config-volume") pod "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" (UID: "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.708434 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" (UID: "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.710296 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n" (OuterVolumeSpecName: "kube-api-access-h8s2n") pod "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" (UID: "03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af"). InnerVolumeSpecName "kube-api-access-h8s2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.809969 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.810221 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:00:03 crc kubenswrapper[4676]: I1204 16:00:03.810311 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8s2n\" (UniqueName: \"kubernetes.io/projected/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af-kube-api-access-h8s2n\") on node \"crc\" DevicePath \"\"" Dec 04 16:00:04 crc kubenswrapper[4676]: I1204 16:00:04.247085 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" event={"ID":"03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af","Type":"ContainerDied","Data":"c5ca6acd865e6f00df82fbbde964bab7308365a4790cad82c85ce33007987e11"} Dec 04 16:00:04 crc kubenswrapper[4676]: I1204 16:00:04.247152 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5ca6acd865e6f00df82fbbde964bab7308365a4790cad82c85ce33007987e11" Dec 04 16:00:04 crc kubenswrapper[4676]: I1204 16:00:04.247258 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h" Dec 04 16:00:04 crc kubenswrapper[4676]: I1204 16:00:04.699822 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"] Dec 04 16:00:04 crc kubenswrapper[4676]: I1204 16:00:04.712636 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414355-rpgmw"] Dec 04 16:00:05 crc kubenswrapper[4676]: I1204 16:00:05.401846 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa64ebc-2612-4a0c-833e-be450fbbd5d0" path="/var/lib/kubelet/pods/daa64ebc-2612-4a0c-833e-be450fbbd5d0/volumes" Dec 04 16:00:11 crc kubenswrapper[4676]: I1204 16:00:11.384786 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:00:11 crc kubenswrapper[4676]: E1204 16:00:11.385702 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:00:26 crc kubenswrapper[4676]: I1204 16:00:26.384753 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:00:26 crc kubenswrapper[4676]: E1204 16:00:26.385826 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:00:37 crc kubenswrapper[4676]: I1204 16:00:37.390302 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:00:37 crc kubenswrapper[4676]: E1204 16:00:37.391216 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:00:48 crc kubenswrapper[4676]: I1204 16:00:48.384464 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:00:48 crc kubenswrapper[4676]: E1204 16:00:48.385372 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.153623 4676 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29414401-6hznt"] Dec 04 16:01:00 crc kubenswrapper[4676]: E1204 16:01:00.155196 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" containerName="collect-profiles" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.155229 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" containerName="collect-profiles" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.155702 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" containerName="collect-profiles" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.157450 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.162878 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414401-6hznt"] Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.209291 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.209338 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9v6p\" (UniqueName: \"kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.209441 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.209465 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.311994 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.313106 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9v6p\" (UniqueName: \"kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.313270 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.313301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.319340 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.319801 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.321512 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.331612 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9v6p\" (UniqueName: \"kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p\") pod \"keystone-cron-29414401-6hznt\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.483232 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:00 crc kubenswrapper[4676]: I1204 16:01:00.957987 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414401-6hznt"] Dec 04 16:01:01 crc kubenswrapper[4676]: I1204 16:01:01.386615 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:01:01 crc kubenswrapper[4676]: E1204 16:01:01.387370 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:01:01 crc kubenswrapper[4676]: I1204 16:01:01.792201 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414401-6hznt" event={"ID":"8cd36f16-1d73-423c-918e-7e1e85929fb7","Type":"ContainerStarted","Data":"ca9af8bf6a9ff7092b09954747d350b3311c7b3be7e74039b1a67f830567252b"} Dec 04 16:01:01 crc kubenswrapper[4676]: I1204 16:01:01.792248 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414401-6hznt" event={"ID":"8cd36f16-1d73-423c-918e-7e1e85929fb7","Type":"ContainerStarted","Data":"8807961aa1faaf8969d863ed5acdcac06fe29cbae58daa412aaf1dbab0e99a5a"} Dec 04 16:01:01 crc kubenswrapper[4676]: I1204 16:01:01.820898 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414401-6hznt" podStartSLOduration=1.8208568330000001 podStartE2EDuration="1.820856833s" podCreationTimestamp="2025-12-04 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 16:01:01.80633176 +0000 UTC m=+2469.241001627" watchObservedRunningTime="2025-12-04 16:01:01.820856833 +0000 UTC m=+2469.255526710" Dec 04 16:01:01 crc kubenswrapper[4676]: I1204 16:01:01.872411 4676 scope.go:117] "RemoveContainer" containerID="68b7984aa978cfaa97649563029cb7f60581f4c8042338841f0cfde5163dad1a" Dec 04 16:01:06 crc kubenswrapper[4676]: I1204 16:01:06.852332 4676 generic.go:334] "Generic (PLEG): container finished" podID="8cd36f16-1d73-423c-918e-7e1e85929fb7" containerID="ca9af8bf6a9ff7092b09954747d350b3311c7b3be7e74039b1a67f830567252b" exitCode=0 Dec 04 16:01:06 crc kubenswrapper[4676]: I1204 16:01:06.852544 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414401-6hznt" event={"ID":"8cd36f16-1d73-423c-918e-7e1e85929fb7","Type":"ContainerDied","Data":"ca9af8bf6a9ff7092b09954747d350b3311c7b3be7e74039b1a67f830567252b"} Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.217312 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.307155 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data\") pod \"8cd36f16-1d73-423c-918e-7e1e85929fb7\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.307191 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle\") pod \"8cd36f16-1d73-423c-918e-7e1e85929fb7\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.307212 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys\") pod \"8cd36f16-1d73-423c-918e-7e1e85929fb7\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.307353 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9v6p\" (UniqueName: \"kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p\") pod \"8cd36f16-1d73-423c-918e-7e1e85929fb7\" (UID: \"8cd36f16-1d73-423c-918e-7e1e85929fb7\") " Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.312379 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p" (OuterVolumeSpecName: "kube-api-access-t9v6p") pod "8cd36f16-1d73-423c-918e-7e1e85929fb7" (UID: "8cd36f16-1d73-423c-918e-7e1e85929fb7"). InnerVolumeSpecName "kube-api-access-t9v6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.312731 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8cd36f16-1d73-423c-918e-7e1e85929fb7" (UID: "8cd36f16-1d73-423c-918e-7e1e85929fb7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.337710 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cd36f16-1d73-423c-918e-7e1e85929fb7" (UID: "8cd36f16-1d73-423c-918e-7e1e85929fb7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.362882 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data" (OuterVolumeSpecName: "config-data") pod "8cd36f16-1d73-423c-918e-7e1e85929fb7" (UID: "8cd36f16-1d73-423c-918e-7e1e85929fb7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.408981 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.409014 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.409025 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8cd36f16-1d73-423c-918e-7e1e85929fb7-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.409035 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9v6p\" (UniqueName: \"kubernetes.io/projected/8cd36f16-1d73-423c-918e-7e1e85929fb7-kube-api-access-t9v6p\") on node \"crc\" DevicePath \"\"" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.872418 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414401-6hznt" event={"ID":"8cd36f16-1d73-423c-918e-7e1e85929fb7","Type":"ContainerDied","Data":"8807961aa1faaf8969d863ed5acdcac06fe29cbae58daa412aaf1dbab0e99a5a"} Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.872468 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8807961aa1faaf8969d863ed5acdcac06fe29cbae58daa412aaf1dbab0e99a5a" Dec 04 16:01:08 crc kubenswrapper[4676]: I1204 16:01:08.872859 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414401-6hznt" Dec 04 16:01:15 crc kubenswrapper[4676]: I1204 16:01:15.385137 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:01:15 crc kubenswrapper[4676]: E1204 16:01:15.386045 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:01:30 crc kubenswrapper[4676]: I1204 16:01:30.388722 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:01:30 crc kubenswrapper[4676]: E1204 16:01:30.391249 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:01:44 crc kubenswrapper[4676]: I1204 16:01:44.384901 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:01:44 crc kubenswrapper[4676]: E1204 16:01:44.385662 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:01:56 crc kubenswrapper[4676]: I1204 16:01:56.384658 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:01:57 crc kubenswrapper[4676]: I1204 16:01:57.537244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6"} Dec 04 16:02:48 crc kubenswrapper[4676]: I1204 16:02:48.043780 4676 generic.go:334] "Generic (PLEG): container finished" podID="9724a435-38f2-4384-b3fe-d5229301866d" containerID="095e5c6fb84430b8c643bdae80572db1358e3a48980d7352482b3930c19c18d5" exitCode=0 Dec 04 16:02:48 crc kubenswrapper[4676]: I1204 16:02:48.043890 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" event={"ID":"9724a435-38f2-4384-b3fe-d5229301866d","Type":"ContainerDied","Data":"095e5c6fb84430b8c643bdae80572db1358e3a48980d7352482b3930c19c18d5"} Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.545820 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.748575 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory\") pod \"9724a435-38f2-4384-b3fe-d5229301866d\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.748833 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle\") pod \"9724a435-38f2-4384-b3fe-d5229301866d\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.748890 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8dfb\" (UniqueName: \"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb\") pod \"9724a435-38f2-4384-b3fe-d5229301866d\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.748962 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0\") pod \"9724a435-38f2-4384-b3fe-d5229301866d\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.749029 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key\") pod \"9724a435-38f2-4384-b3fe-d5229301866d\" (UID: \"9724a435-38f2-4384-b3fe-d5229301866d\") " Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.760450 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb" (OuterVolumeSpecName: "kube-api-access-x8dfb") pod "9724a435-38f2-4384-b3fe-d5229301866d" (UID: "9724a435-38f2-4384-b3fe-d5229301866d"). InnerVolumeSpecName "kube-api-access-x8dfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.761021 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9724a435-38f2-4384-b3fe-d5229301866d" (UID: "9724a435-38f2-4384-b3fe-d5229301866d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.785779 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9724a435-38f2-4384-b3fe-d5229301866d" (UID: "9724a435-38f2-4384-b3fe-d5229301866d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.792484 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9724a435-38f2-4384-b3fe-d5229301866d" (UID: "9724a435-38f2-4384-b3fe-d5229301866d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.795877 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory" (OuterVolumeSpecName: "inventory") pod "9724a435-38f2-4384-b3fe-d5229301866d" (UID: "9724a435-38f2-4384-b3fe-d5229301866d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.851284 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.851337 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.851354 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8dfb\" (UniqueName: \"kubernetes.io/projected/9724a435-38f2-4384-b3fe-d5229301866d-kube-api-access-x8dfb\") on node \"crc\" DevicePath \"\"" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.851366 4676 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:02:49 crc kubenswrapper[4676]: I1204 16:02:49.851378 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9724a435-38f2-4384-b3fe-d5229301866d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.067126 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" event={"ID":"9724a435-38f2-4384-b3fe-d5229301866d","Type":"ContainerDied","Data":"61341f7f038293e0a95ec3f006d33bd007f521a56add547dd1608edff68fdea3"} Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.068158 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61341f7f038293e0a95ec3f006d33bd007f521a56add547dd1608edff68fdea3" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.067247 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.187665 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr"] Dec 04 16:02:50 crc kubenswrapper[4676]: E1204 16:02:50.188296 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cd36f16-1d73-423c-918e-7e1e85929fb7" containerName="keystone-cron" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.188333 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cd36f16-1d73-423c-918e-7e1e85929fb7" containerName="keystone-cron" Dec 04 16:02:50 crc kubenswrapper[4676]: E1204 16:02:50.188365 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9724a435-38f2-4384-b3fe-d5229301866d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.188373 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9724a435-38f2-4384-b3fe-d5229301866d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.188636 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9724a435-38f2-4384-b3fe-d5229301866d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.188666 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cd36f16-1d73-423c-918e-7e1e85929fb7" containerName="keystone-cron" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.190219 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.192767 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.192879 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.193166 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.193343 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.193373 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.193584 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.193663 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.201450 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr"] Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260037 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260144 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260180 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260232 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260249 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260309 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cbww\" (UniqueName: \"kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260332 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260357 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.260406 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.362411 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.362469 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.362541 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363243 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363459 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cbww\" (UniqueName: \"kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363502 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363546 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363686 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: 
\"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.363766 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.365265 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.368544 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.368601 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.368966 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.369669 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.370056 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.371182 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc 
kubenswrapper[4676]: I1204 16:02:50.376834 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.386947 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cbww\" (UniqueName: \"kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww\") pod \"nova-edpm-deployment-openstack-edpm-ipam-px4sr\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:50 crc kubenswrapper[4676]: I1204 16:02:50.513527 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:02:51 crc kubenswrapper[4676]: I1204 16:02:51.054634 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr"] Dec 04 16:02:51 crc kubenswrapper[4676]: I1204 16:02:51.060764 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 16:02:51 crc kubenswrapper[4676]: I1204 16:02:51.079055 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" event={"ID":"7841e048-3b6b-4361-a2f5-0d7de2cca7e9","Type":"ContainerStarted","Data":"cd4fbe13e0eefa03d037004b867b1769dd72cce778f2573627091721b080afe6"} Dec 04 16:02:52 crc kubenswrapper[4676]: I1204 16:02:52.115506 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" event={"ID":"7841e048-3b6b-4361-a2f5-0d7de2cca7e9","Type":"ContainerStarted","Data":"d2c5b0b4ee044c2aeab58381b842ccb8fa2c4fc719948aebededaff27226aeca"} Dec 04 16:02:52 crc kubenswrapper[4676]: I1204 16:02:52.146174 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" podStartSLOduration=1.718195153 podStartE2EDuration="2.146144427s" podCreationTimestamp="2025-12-04 16:02:50 +0000 UTC" firstStartedPulling="2025-12-04 16:02:51.060460099 +0000 UTC m=+2578.495129956" lastFinishedPulling="2025-12-04 16:02:51.488409373 +0000 UTC m=+2578.923079230" observedRunningTime="2025-12-04 16:02:52.132176489 +0000 UTC m=+2579.566846356" watchObservedRunningTime="2025-12-04 16:02:52.146144427 +0000 UTC m=+2579.580814274" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.578794 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.581648 4676 util.go:30] "No sandbox for pod can be found. 
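
For this pod the two startup figures differ, and the gap is exactly the image-pull window: podStartE2EDuration (2.146144427s, creation at 16:02:50 to observed running) minus the span from firstStartedPulling (16:02:51.060460099) to lastFinishedPulling (16:02:51.488409373), i.e. 0.427949274s, gives podStartSLOduration = 1.718195153s. The arithmetic, with timestamps copied from the entry:

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-04 16:02:50 +0000 UTC")
        pullStart := mustParse("2025-12-04 16:02:51.060460099 +0000 UTC")
        pullEnd := mustParse("2025-12-04 16:02:51.488409373 +0000 UTC")
        running := mustParse("2025-12-04 16:02:52.146144427 +0000 UTC")

        e2e := running.Sub(created)         // 2.146144427s
        slo := e2e - pullEnd.Sub(pullStart) // pull time excluded: 1.718195153s
        fmt.Println(e2e, slo)
    }
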
Need to start a new one" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.601548 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.605453 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.605562 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n4sx\" (UniqueName: \"kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.605687 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.706981 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n4sx\" (UniqueName: \"kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.707076 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.707172 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.707642 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.707813 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.728382 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4n4sx\" (UniqueName: \"kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx\") pod \"redhat-operators-wpzm4\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:06 crc kubenswrapper[4676]: I1204 16:04:06.906020 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:07 crc kubenswrapper[4676]: I1204 16:04:07.448379 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:08 crc kubenswrapper[4676]: I1204 16:04:08.177043 4676 generic.go:334] "Generic (PLEG): container finished" podID="82a9d606-e475-4142-b734-f7eee88805be" containerID="c3f0f94b08f87ed5e3e56ec5ad1f70e6f13c4506f39c5e5068081cfbff7902cb" exitCode=0 Dec 04 16:04:08 crc kubenswrapper[4676]: I1204 16:04:08.177097 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerDied","Data":"c3f0f94b08f87ed5e3e56ec5ad1f70e6f13c4506f39c5e5068081cfbff7902cb"} Dec 04 16:04:08 crc kubenswrapper[4676]: I1204 16:04:08.177142 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerStarted","Data":"1c1a32abf9126d229b1f45bb5ec60a9d6cd187cb96f47592e3d55ede814b5df6"} Dec 04 16:04:10 crc kubenswrapper[4676]: I1204 16:04:10.197497 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerStarted","Data":"135bb0f07305ad19e631f25c9ac0c9993a62bd020ac5ad0c609afb13f448a280"} Dec 04 16:04:12 crc kubenswrapper[4676]: I1204 16:04:12.229189 4676 generic.go:334] "Generic (PLEG): container finished" podID="82a9d606-e475-4142-b734-f7eee88805be" containerID="135bb0f07305ad19e631f25c9ac0c9993a62bd020ac5ad0c609afb13f448a280" exitCode=0 Dec 04 16:04:12 crc kubenswrapper[4676]: I1204 16:04:12.229296 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerDied","Data":"135bb0f07305ad19e631f25c9ac0c9993a62bd020ac5ad0c609afb13f448a280"} Dec 04 16:04:14 crc kubenswrapper[4676]: I1204 16:04:14.253504 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerStarted","Data":"29388a76b5c189ba03a6cc1a551442a614545805169e69b03404cf8311df29aa"} Dec 04 16:04:14 crc kubenswrapper[4676]: I1204 16:04:14.295025 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wpzm4" podStartSLOduration=3.103621372 podStartE2EDuration="8.294972193s" podCreationTimestamp="2025-12-04 16:04:06 +0000 UTC" firstStartedPulling="2025-12-04 16:04:08.18127648 +0000 UTC m=+2655.615946327" lastFinishedPulling="2025-12-04 16:04:13.372627291 +0000 UTC m=+2660.807297148" observedRunningTime="2025-12-04 16:04:14.285284504 +0000 UTC m=+2661.719954361" watchObservedRunningTime="2025-12-04 16:04:14.294972193 +0000 UTC m=+2661.729642050" Dec 04 16:04:16 crc kubenswrapper[4676]: I1204 16:04:16.027272 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:04:16 crc kubenswrapper[4676]: I1204 16:04:16.027349 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:04:16 crc kubenswrapper[4676]: I1204 16:04:16.907007 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:16 crc kubenswrapper[4676]: I1204 16:04:16.907375 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:17 crc kubenswrapper[4676]: I1204 16:04:17.969264 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wpzm4" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="registry-server" probeResult="failure" output=< Dec 04 16:04:17 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 16:04:17 crc kubenswrapper[4676]: > Dec 04 16:04:27 crc kubenswrapper[4676]: I1204 16:04:27.002971 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:27 crc kubenswrapper[4676]: I1204 16:04:27.090199 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:27 crc kubenswrapper[4676]: I1204 16:04:27.281366 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:28 crc kubenswrapper[4676]: I1204 16:04:28.455272 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wpzm4" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="registry-server" containerID="cri-o://29388a76b5c189ba03a6cc1a551442a614545805169e69b03404cf8311df29aa" gracePeriod=2 Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.469287 4676 generic.go:334] "Generic (PLEG): container finished" podID="82a9d606-e475-4142-b734-f7eee88805be" containerID="29388a76b5c189ba03a6cc1a551442a614545805169e69b03404cf8311df29aa" exitCode=0 Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.469860 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerDied","Data":"29388a76b5c189ba03a6cc1a551442a614545805169e69b03404cf8311df29aa"} Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.469895 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wpzm4" event={"ID":"82a9d606-e475-4142-b734-f7eee88805be","Type":"ContainerDied","Data":"1c1a32abf9126d229b1f45bb5ec60a9d6cd187cb96f47592e3d55ede814b5df6"} Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.469931 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c1a32abf9126d229b1f45bb5ec60a9d6cd187cb96f47592e3d55ede814b5df6" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.555962 4676 util.go:48] "No ready sandbox for pod can be found. 
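
Two probe styles fail in this stretch: the machine-config-daemon liveness probe is an HTTP GET against 127.0.0.1:8798/health that gets connection refused while the daemon sits in backoff, and the redhat-operators registry-server startup probe reports "timeout: failed to connect service \":50051\" within 1s" until the gRPC registry starts listening (it flips to started/ready at 16:04:27). A minimal Go sketch of both checks, with plain TCP reachability standing in for the real gRPC health call:

    package main

    import (
        "fmt"
        "net"
        "net/http"
        "time"
    )

    // httpProbe mirrors the liveness check: GET http://127.0.0.1:8798/health;
    // any dial error (e.g. connection refused) is a failure.
    func httpProbe(url string) error {
        resp, err := http.Get(url)
        if err != nil {
            return err
        }
        resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("unhealthy: %s", resp.Status)
        }
        return nil
    }

    // tcpProbe stands in for the registry-server startup probe: connect to
    // :50051 within 1s or report failure (the real probe speaks gRPC health).
    func tcpProbe(addr string) error {
        conn, err := net.DialTimeout("tcp", addr, time.Second)
        if err != nil {
            return fmt.Errorf("timeout: failed to connect service %q within 1s", addr)
        }
        return conn.Close()
    }

    func main() {
        fmt.Println(httpProbe("http://127.0.0.1:8798/health"))
        fmt.Println(tcpProbe("127.0.0.1:50051"))
    }
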
Need to start a new one" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.708357 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content\") pod \"82a9d606-e475-4142-b734-f7eee88805be\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.722568 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n4sx\" (UniqueName: \"kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx\") pod \"82a9d606-e475-4142-b734-f7eee88805be\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.723007 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities\") pod \"82a9d606-e475-4142-b734-f7eee88805be\" (UID: \"82a9d606-e475-4142-b734-f7eee88805be\") " Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.724330 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities" (OuterVolumeSpecName: "utilities") pod "82a9d606-e475-4142-b734-f7eee88805be" (UID: "82a9d606-e475-4142-b734-f7eee88805be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.728834 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx" (OuterVolumeSpecName: "kube-api-access-4n4sx") pod "82a9d606-e475-4142-b734-f7eee88805be" (UID: "82a9d606-e475-4142-b734-f7eee88805be"). InnerVolumeSpecName "kube-api-access-4n4sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.818395 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82a9d606-e475-4142-b734-f7eee88805be" (UID: "82a9d606-e475-4142-b734-f7eee88805be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.825548 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.825599 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n4sx\" (UniqueName: \"kubernetes.io/projected/82a9d606-e475-4142-b734-f7eee88805be-kube-api-access-4n4sx\") on node \"crc\" DevicePath \"\"" Dec 04 16:04:29 crc kubenswrapper[4676]: I1204 16:04:29.825624 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82a9d606-e475-4142-b734-f7eee88805be-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:04:30 crc kubenswrapper[4676]: I1204 16:04:30.478640 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wpzm4" Dec 04 16:04:30 crc kubenswrapper[4676]: I1204 16:04:30.517295 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:30 crc kubenswrapper[4676]: I1204 16:04:30.526365 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wpzm4"] Dec 04 16:04:31 crc kubenswrapper[4676]: I1204 16:04:31.441841 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a9d606-e475-4142-b734-f7eee88805be" path="/var/lib/kubelet/pods/82a9d606-e475-4142-b734-f7eee88805be/volumes" Dec 04 16:04:46 crc kubenswrapper[4676]: I1204 16:04:46.026776 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:04:46 crc kubenswrapper[4676]: I1204 16:04:46.027433 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:05:16 crc kubenswrapper[4676]: I1204 16:05:16.026651 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:05:16 crc kubenswrapper[4676]: I1204 16:05:16.027217 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:05:16 crc kubenswrapper[4676]: I1204 16:05:16.027274 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:05:16 crc kubenswrapper[4676]: I1204 16:05:16.028168 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:05:16 crc kubenswrapper[4676]: I1204 16:05:16.028232 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6" gracePeriod=600 Dec 04 16:05:17 crc kubenswrapper[4676]: I1204 16:05:17.206139 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6" exitCode=0 Dec 04 16:05:17 crc kubenswrapper[4676]: I1204 16:05:17.206669 4676 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6"} Dec 04 16:05:17 crc kubenswrapper[4676]: I1204 16:05:17.206698 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5"} Dec 04 16:05:17 crc kubenswrapper[4676]: I1204 16:05:17.206738 4676 scope.go:117] "RemoveContainer" containerID="a56dca054ecca1c0fa4c414e60a4699b7b474a065e11844a0faef220fb8f2640" Dec 04 16:06:03 crc kubenswrapper[4676]: I1204 16:06:03.641838 4676 generic.go:334] "Generic (PLEG): container finished" podID="7841e048-3b6b-4361-a2f5-0d7de2cca7e9" containerID="d2c5b0b4ee044c2aeab58381b842ccb8fa2c4fc719948aebededaff27226aeca" exitCode=0 Dec 04 16:06:03 crc kubenswrapper[4676]: I1204 16:06:03.641915 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" event={"ID":"7841e048-3b6b-4361-a2f5-0d7de2cca7e9","Type":"ContainerDied","Data":"d2c5b0b4ee044c2aeab58381b842ccb8fa2c4fc719948aebededaff27226aeca"} Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.095847 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225189 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225275 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225316 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cbww\" (UniqueName: \"kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225347 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225375 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225432 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225472 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225518 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.225611 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key\") pod \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\" (UID: \"7841e048-3b6b-4361-a2f5-0d7de2cca7e9\") " Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.233796 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.238315 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww" (OuterVolumeSpecName: "kube-api-access-4cbww") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "kube-api-access-4cbww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.258755 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.260167 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.260393 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.270254 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.272578 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.273690 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory" (OuterVolumeSpecName: "inventory") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.277461 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7841e048-3b6b-4361-a2f5-0d7de2cca7e9" (UID: "7841e048-3b6b-4361-a2f5-0d7de2cca7e9"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.328922 4676 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329261 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329278 4676 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329292 4676 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329304 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329317 4676 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329328 4676 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329339 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cbww\" (UniqueName: \"kubernetes.io/projected/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-kube-api-access-4cbww\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.329350 4676 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7841e048-3b6b-4361-a2f5-0d7de2cca7e9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.667069 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" event={"ID":"7841e048-3b6b-4361-a2f5-0d7de2cca7e9","Type":"ContainerDied","Data":"cd4fbe13e0eefa03d037004b867b1769dd72cce778f2573627091721b080afe6"} Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.667477 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd4fbe13e0eefa03d037004b867b1769dd72cce778f2573627091721b080afe6" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.667185 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-px4sr" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.775531 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p"] Dec 04 16:06:05 crc kubenswrapper[4676]: E1204 16:06:05.776078 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="extract-utilities" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776109 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="extract-utilities" Dec 04 16:06:05 crc kubenswrapper[4676]: E1204 16:06:05.776136 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="extract-content" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776149 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="extract-content" Dec 04 16:06:05 crc kubenswrapper[4676]: E1204 16:06:05.776169 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="registry-server" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776179 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="registry-server" Dec 04 16:06:05 crc kubenswrapper[4676]: E1204 16:06:05.776195 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7841e048-3b6b-4361-a2f5-0d7de2cca7e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776201 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7841e048-3b6b-4361-a2f5-0d7de2cca7e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776452 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a9d606-e475-4142-b734-f7eee88805be" containerName="registry-server" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.776481 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7841e048-3b6b-4361-a2f5-0d7de2cca7e9" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.777479 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.779599 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.780463 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.780482 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-7dc5t" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.781155 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.782518 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.799171 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p"] Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941117 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941173 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941211 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941341 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4pp8\" (UniqueName: \"kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941411 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 
crc kubenswrapper[4676]: I1204 16:06:05.941447 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:05 crc kubenswrapper[4676]: I1204 16:06:05.941523 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.042831 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4pp8\" (UniqueName: \"kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.042939 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.042968 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.043028 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.043104 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.043122 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.043143 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.047430 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.047764 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.047837 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.048983 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.051004 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.054450 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.064953 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4pp8\" (UniqueName: \"kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") 
" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.101176 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.605848 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p"] Dec 04 16:06:06 crc kubenswrapper[4676]: I1204 16:06:06.681753 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" event={"ID":"739e4574-6964-41c1-833b-3379e794681a","Type":"ContainerStarted","Data":"44ab57c30795506302724a55c67b4e55f5be2bf95ae55574a25087a68a0c5cc0"} Dec 04 16:06:07 crc kubenswrapper[4676]: I1204 16:06:07.695038 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" event={"ID":"739e4574-6964-41c1-833b-3379e794681a","Type":"ContainerStarted","Data":"4ab2aa777f510a91c419fa2ec59a6a66fc82fc22d56c9f85bf42426400097357"} Dec 04 16:06:07 crc kubenswrapper[4676]: I1204 16:06:07.716714 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" podStartSLOduration=2.236714163 podStartE2EDuration="2.716667592s" podCreationTimestamp="2025-12-04 16:06:05 +0000 UTC" firstStartedPulling="2025-12-04 16:06:06.611113751 +0000 UTC m=+2774.045783608" lastFinishedPulling="2025-12-04 16:06:07.09106718 +0000 UTC m=+2774.525737037" observedRunningTime="2025-12-04 16:06:07.711132118 +0000 UTC m=+2775.145802005" watchObservedRunningTime="2025-12-04 16:06:07.716667592 +0000 UTC m=+2775.151337469" Dec 04 16:07:16 crc kubenswrapper[4676]: I1204 16:07:16.027090 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:07:16 crc kubenswrapper[4676]: I1204 16:07:16.027569 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.502222 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"] Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.504785 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.523334 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"] Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.688055 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.688555 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.688709 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82rlc\" (UniqueName: \"kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.791491 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.791809 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82rlc\" (UniqueName: \"kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.791926 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.792242 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.792412 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.814394 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-82rlc\" (UniqueName: \"kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc\") pod \"certified-operators-dd8bp\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") " pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:17 crc kubenswrapper[4676]: I1204 16:07:17.827132 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:18 crc kubenswrapper[4676]: I1204 16:07:18.379462 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"] Dec 04 16:07:18 crc kubenswrapper[4676]: I1204 16:07:18.508238 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerStarted","Data":"8929fa4841cb1dcc66190698cadb802e4621343fda8906ee4287f44332116c66"} Dec 04 16:07:20 crc kubenswrapper[4676]: I1204 16:07:20.530468 4676 generic.go:334] "Generic (PLEG): container finished" podID="28063a31-4486-49db-9562-331dec0a5349" containerID="d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2" exitCode=0 Dec 04 16:07:20 crc kubenswrapper[4676]: I1204 16:07:20.530658 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerDied","Data":"d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2"} Dec 04 16:07:27 crc kubenswrapper[4676]: I1204 16:07:27.601754 4676 generic.go:334] "Generic (PLEG): container finished" podID="28063a31-4486-49db-9562-331dec0a5349" containerID="80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e" exitCode=0 Dec 04 16:07:27 crc kubenswrapper[4676]: I1204 16:07:27.601830 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerDied","Data":"80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e"} Dec 04 16:07:28 crc kubenswrapper[4676]: I1204 16:07:28.614488 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerStarted","Data":"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"} Dec 04 16:07:28 crc kubenswrapper[4676]: I1204 16:07:28.640264 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dd8bp" podStartSLOduration=3.855217121 podStartE2EDuration="11.640224508s" podCreationTimestamp="2025-12-04 16:07:17 +0000 UTC" firstStartedPulling="2025-12-04 16:07:20.532614463 +0000 UTC m=+2847.967284320" lastFinishedPulling="2025-12-04 16:07:28.31762184 +0000 UTC m=+2855.752291707" observedRunningTime="2025-12-04 16:07:28.629867201 +0000 UTC m=+2856.064537058" watchObservedRunningTime="2025-12-04 16:07:28.640224508 +0000 UTC m=+2856.074894375" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.458884 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.461655 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.472885 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.659892 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.660009 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6h6\" (UniqueName: \"kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.660088 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.762126 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.762305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6h6\" (UniqueName: \"kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.762406 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.762625 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.762924 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.784269 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ms6h6\" (UniqueName: \"kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6\") pod \"community-operators-bdqw6\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:34 crc kubenswrapper[4676]: I1204 16:07:34.784758 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:35 crc kubenswrapper[4676]: W1204 16:07:35.356538 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7e84100_7fd9_4f9e_9229_c46af09c005e.slice/crio-6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892 WatchSource:0}: Error finding container 6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892: Status 404 returned error can't find the container with id 6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892 Dec 04 16:07:35 crc kubenswrapper[4676]: I1204 16:07:35.358056 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:07:35 crc kubenswrapper[4676]: I1204 16:07:35.722029 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerStarted","Data":"6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892"} Dec 04 16:07:37 crc kubenswrapper[4676]: I1204 16:07:37.827717 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:37 crc kubenswrapper[4676]: I1204 16:07:37.828009 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:37 crc kubenswrapper[4676]: I1204 16:07:37.883654 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:38 crc kubenswrapper[4676]: I1204 16:07:38.755005 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerStarted","Data":"2afc409358cfee0898c8ac2bd47a2104f82d9d6db8c4c3dff03758b954ca24eb"} Dec 04 16:07:38 crc kubenswrapper[4676]: I1204 16:07:38.799976 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dd8bp" Dec 04 16:07:39 crc kubenswrapper[4676]: I1204 16:07:39.768046 4676 generic.go:334] "Generic (PLEG): container finished" podID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerID="2afc409358cfee0898c8ac2bd47a2104f82d9d6db8c4c3dff03758b954ca24eb" exitCode=0 Dec 04 16:07:39 crc kubenswrapper[4676]: I1204 16:07:39.768135 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerDied","Data":"2afc409358cfee0898c8ac2bd47a2104f82d9d6db8c4c3dff03758b954ca24eb"} Dec 04 16:07:42 crc kubenswrapper[4676]: I1204 16:07:42.672651 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"] Dec 04 16:07:42 crc kubenswrapper[4676]: I1204 16:07:42.810618 4676 generic.go:334] "Generic (PLEG): container finished" podID="d7e84100-7fd9-4f9e-9229-c46af09c005e" 
containerID="8b91020abe4ae4ea0ca2ad41f69d737ef5b18e7b1d73edc711174159bff13424" exitCode=0 Dec 04 16:07:42 crc kubenswrapper[4676]: I1204 16:07:42.810689 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerDied","Data":"8b91020abe4ae4ea0ca2ad41f69d737ef5b18e7b1d73edc711174159bff13424"} Dec 04 16:07:43 crc kubenswrapper[4676]: I1204 16:07:43.452258 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 16:07:43 crc kubenswrapper[4676]: I1204 16:07:43.452797 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gdnsz" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="registry-server" containerID="cri-o://416ca805cb14fb557246da2a611333b84b335e440cd1a780d6e3d0633893b54e" gracePeriod=2 Dec 04 16:07:46 crc kubenswrapper[4676]: I1204 16:07:46.027265 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:07:46 crc kubenswrapper[4676]: I1204 16:07:46.027575 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:07:48 crc kubenswrapper[4676]: I1204 16:07:48.969414 4676 generic.go:334] "Generic (PLEG): container finished" podID="aebba73c-4263-4e22-a922-de02e092f260" containerID="416ca805cb14fb557246da2a611333b84b335e440cd1a780d6e3d0633893b54e" exitCode=0 Dec 04 16:07:48 crc kubenswrapper[4676]: I1204 16:07:48.969487 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerDied","Data":"416ca805cb14fb557246da2a611333b84b335e440cd1a780d6e3d0633893b54e"} Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.129767 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.269048 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content\") pod \"aebba73c-4263-4e22-a922-de02e092f260\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.269291 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wq2v\" (UniqueName: \"kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v\") pod \"aebba73c-4263-4e22-a922-de02e092f260\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.269410 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities\") pod \"aebba73c-4263-4e22-a922-de02e092f260\" (UID: \"aebba73c-4263-4e22-a922-de02e092f260\") " Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.270086 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities" (OuterVolumeSpecName: "utilities") pod "aebba73c-4263-4e22-a922-de02e092f260" (UID: "aebba73c-4263-4e22-a922-de02e092f260"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.275806 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v" (OuterVolumeSpecName: "kube-api-access-5wq2v") pod "aebba73c-4263-4e22-a922-de02e092f260" (UID: "aebba73c-4263-4e22-a922-de02e092f260"). InnerVolumeSpecName "kube-api-access-5wq2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.316890 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aebba73c-4263-4e22-a922-de02e092f260" (UID: "aebba73c-4263-4e22-a922-de02e092f260"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.372046 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.372089 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wq2v\" (UniqueName: \"kubernetes.io/projected/aebba73c-4263-4e22-a922-de02e092f260-kube-api-access-5wq2v\") on node \"crc\" DevicePath \"\"" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.372104 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aebba73c-4263-4e22-a922-de02e092f260-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.983032 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gdnsz" event={"ID":"aebba73c-4263-4e22-a922-de02e092f260","Type":"ContainerDied","Data":"f928b2470c100f0520746383020a9dcf2e8bbee65b417990b8664841ac08d6a7"} Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.983098 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gdnsz" Dec 04 16:07:49 crc kubenswrapper[4676]: I1204 16:07:49.983148 4676 scope.go:117] "RemoveContainer" containerID="416ca805cb14fb557246da2a611333b84b335e440cd1a780d6e3d0633893b54e" Dec 04 16:07:50 crc kubenswrapper[4676]: I1204 16:07:50.016170 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 16:07:50 crc kubenswrapper[4676]: I1204 16:07:50.025756 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gdnsz"] Dec 04 16:07:50 crc kubenswrapper[4676]: I1204 16:07:50.360245 4676 scope.go:117] "RemoveContainer" containerID="431da6b8d0b69f4cc44f223523399a8024f11c4b3bcaae7e6d66304e181ca45f" Dec 04 16:07:50 crc kubenswrapper[4676]: I1204 16:07:50.475639 4676 scope.go:117] "RemoveContainer" containerID="8c5205e31092b924aa4e81ff1395c807de46a3bf4622b47fbba7a1627e466418" Dec 04 16:07:51 crc kubenswrapper[4676]: I1204 16:07:51.397484 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aebba73c-4263-4e22-a922-de02e092f260" path="/var/lib/kubelet/pods/aebba73c-4263-4e22-a922-de02e092f260/volumes" Dec 04 16:07:52 crc kubenswrapper[4676]: I1204 16:07:52.007043 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerStarted","Data":"4d4f6e9c6a466186a989101b3403a40d4afcbdd3e84effc36af36f46f6002c8d"} Dec 04 16:07:52 crc kubenswrapper[4676]: I1204 16:07:52.034141 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bdqw6" podStartSLOduration=7.084932077 podStartE2EDuration="18.033996401s" podCreationTimestamp="2025-12-04 16:07:34 +0000 UTC" firstStartedPulling="2025-12-04 16:07:39.771157863 +0000 UTC m=+2867.205827720" lastFinishedPulling="2025-12-04 16:07:50.720222187 +0000 UTC m=+2878.154892044" observedRunningTime="2025-12-04 16:07:52.026428214 +0000 UTC m=+2879.461098081" watchObservedRunningTime="2025-12-04 16:07:52.033996401 +0000 UTC m=+2879.468666258" Dec 04 16:07:54 crc kubenswrapper[4676]: I1204 16:07:54.786067 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:54 crc kubenswrapper[4676]: I1204 16:07:54.786610 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:54 crc kubenswrapper[4676]: I1204 16:07:54.840417 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:56 crc kubenswrapper[4676]: I1204 16:07:56.098239 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:57 crc kubenswrapper[4676]: I1204 16:07:57.250469 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:07:58 crc kubenswrapper[4676]: I1204 16:07:58.084953 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bdqw6" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="registry-server" containerID="cri-o://4d4f6e9c6a466186a989101b3403a40d4afcbdd3e84effc36af36f46f6002c8d" gracePeriod=2 Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.106694 4676 generic.go:334] "Generic (PLEG): container finished" podID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerID="4d4f6e9c6a466186a989101b3403a40d4afcbdd3e84effc36af36f46f6002c8d" exitCode=0 Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.107299 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerDied","Data":"4d4f6e9c6a466186a989101b3403a40d4afcbdd3e84effc36af36f46f6002c8d"} Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.107333 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bdqw6" event={"ID":"d7e84100-7fd9-4f9e-9229-c46af09c005e","Type":"ContainerDied","Data":"6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892"} Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.107356 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a4f5de478070919fd1292e01c8cb94d4a4399b16ac45a9e12886394d13ba892" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.170980 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.282393 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content\") pod \"d7e84100-7fd9-4f9e-9229-c46af09c005e\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.282460 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6h6\" (UniqueName: \"kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6\") pod \"d7e84100-7fd9-4f9e-9229-c46af09c005e\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.282541 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities\") pod \"d7e84100-7fd9-4f9e-9229-c46af09c005e\" (UID: \"d7e84100-7fd9-4f9e-9229-c46af09c005e\") " Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.283544 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities" (OuterVolumeSpecName: "utilities") pod "d7e84100-7fd9-4f9e-9229-c46af09c005e" (UID: "d7e84100-7fd9-4f9e-9229-c46af09c005e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.289558 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6" (OuterVolumeSpecName: "kube-api-access-ms6h6") pod "d7e84100-7fd9-4f9e-9229-c46af09c005e" (UID: "d7e84100-7fd9-4f9e-9229-c46af09c005e"). InnerVolumeSpecName "kube-api-access-ms6h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.345766 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7e84100-7fd9-4f9e-9229-c46af09c005e" (UID: "d7e84100-7fd9-4f9e-9229-c46af09c005e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.385953 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6h6\" (UniqueName: \"kubernetes.io/projected/d7e84100-7fd9-4f9e-9229-c46af09c005e-kube-api-access-ms6h6\") on node \"crc\" DevicePath \"\"" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.386008 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:07:59 crc kubenswrapper[4676]: I1204 16:07:59.386019 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7e84100-7fd9-4f9e-9229-c46af09c005e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:00 crc kubenswrapper[4676]: I1204 16:08:00.115665 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bdqw6" Dec 04 16:08:00 crc kubenswrapper[4676]: I1204 16:08:00.142089 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:08:00 crc kubenswrapper[4676]: I1204 16:08:00.151962 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bdqw6"] Dec 04 16:08:01 crc kubenswrapper[4676]: I1204 16:08:01.400022 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" path="/var/lib/kubelet/pods/d7e84100-7fd9-4f9e-9229-c46af09c005e/volumes" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.026754 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.027309 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.027365 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.028245 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.028319 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" gracePeriod=600 Dec 04 16:08:16 crc kubenswrapper[4676]: E1204 16:08:16.149103 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.516947 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" exitCode=0 Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.517010 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5"} Dec 04 
16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.517103 4676 scope.go:117] "RemoveContainer" containerID="23593ec121879c14847d74f1e6c298bf5947fc489e28bc53b3d892ba8fda12d6" Dec 04 16:08:16 crc kubenswrapper[4676]: I1204 16:08:16.517854 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:08:16 crc kubenswrapper[4676]: E1204 16:08:16.518160 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:08:29 crc kubenswrapper[4676]: I1204 16:08:29.384302 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:08:29 crc kubenswrapper[4676]: E1204 16:08:29.385189 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:08:43 crc kubenswrapper[4676]: I1204 16:08:43.878069 4676 generic.go:334] "Generic (PLEG): container finished" podID="739e4574-6964-41c1-833b-3379e794681a" containerID="4ab2aa777f510a91c419fa2ec59a6a66fc82fc22d56c9f85bf42426400097357" exitCode=0 Dec 04 16:08:43 crc kubenswrapper[4676]: I1204 16:08:43.878191 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" event={"ID":"739e4574-6964-41c1-833b-3379e794681a","Type":"ContainerDied","Data":"4ab2aa777f510a91c419fa2ec59a6a66fc82fc22d56c9f85bf42426400097357"} Dec 04 16:08:44 crc kubenswrapper[4676]: I1204 16:08:44.384686 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:08:44 crc kubenswrapper[4676]: E1204 16:08:44.385150 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.278969 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.476731 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.477110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.477219 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.478025 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.478071 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4pp8\" (UniqueName: \"kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.478143 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.478183 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1\") pod \"739e4574-6964-41c1-833b-3379e794681a\" (UID: \"739e4574-6964-41c1-833b-3379e794681a\") " Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.482903 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8" (OuterVolumeSpecName: "kube-api-access-v4pp8") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "kube-api-access-v4pp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.483581 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.507781 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.508854 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory" (OuterVolumeSpecName: "inventory") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.510359 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.510743 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.516459 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "739e4574-6964-41c1-833b-3379e794681a" (UID: "739e4574-6964-41c1-833b-3379e794681a"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580331 4676 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580375 4676 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-inventory\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580387 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4pp8\" (UniqueName: \"kubernetes.io/projected/739e4574-6964-41c1-833b-3379e794681a-kube-api-access-v4pp8\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580400 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580414 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580434 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.580447 4676 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/739e4574-6964-41c1-833b-3379e794681a-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.898466 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" event={"ID":"739e4574-6964-41c1-833b-3379e794681a","Type":"ContainerDied","Data":"44ab57c30795506302724a55c67b4e55f5be2bf95ae55574a25087a68a0c5cc0"} Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.898508 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44ab57c30795506302724a55c67b4e55f5be2bf95ae55574a25087a68a0c5cc0" Dec 04 16:08:45 crc kubenswrapper[4676]: I1204 16:08:45.898547 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p" Dec 04 16:08:55 crc kubenswrapper[4676]: I1204 16:08:55.384683 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:08:55 crc kubenswrapper[4676]: E1204 16:08:55.385441 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.083395 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.085433 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="extract-content" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.085580 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="extract-content" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.085683 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="extract-utilities" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.085767 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="extract-utilities" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.085862 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.085960 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.086050 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="extract-utilities" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.086123 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="extract-utilities" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.086201 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="extract-content" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.086279 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="extract-content" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.086363 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="739e4574-6964-41c1-833b-3379e794681a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.086437 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="739e4574-6964-41c1-833b-3379e794681a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 16:08:58 crc kubenswrapper[4676]: E1204 16:08:58.086533 4676 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.086603 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.087033 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="739e4574-6964-41c1-833b-3379e794681a" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.087157 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="aebba73c-4263-4e22-a922-de02e092f260" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.087241 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e84100-7fd9-4f9e-9229-c46af09c005e" containerName="registry-server" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.088988 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.100086 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.261982 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.262054 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.262526 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpv87\" (UniqueName: \"kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.364223 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpv87\" (UniqueName: \"kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.364359 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.364387 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.364833 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.365420 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.396776 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpv87\" (UniqueName: \"kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87\") pod \"redhat-marketplace-6msp5\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.430026 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:08:58 crc kubenswrapper[4676]: I1204 16:08:58.965650 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:08:59 crc kubenswrapper[4676]: I1204 16:08:59.345399 4676 generic.go:334] "Generic (PLEG): container finished" podID="5442582f-4481-43c8-9fb5-701b47cb5674" containerID="a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2" exitCode=0 Dec 04 16:08:59 crc kubenswrapper[4676]: I1204 16:08:59.345508 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerDied","Data":"a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2"} Dec 04 16:08:59 crc kubenswrapper[4676]: I1204 16:08:59.345562 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerStarted","Data":"77e8c79e4b0d1bf66f5af83620f566a8160f4854c0d2a97306fd5b36f684369d"} Dec 04 16:08:59 crc kubenswrapper[4676]: I1204 16:08:59.347922 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 16:09:00 crc kubenswrapper[4676]: I1204 16:09:00.356226 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerStarted","Data":"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5"} Dec 04 16:09:01 crc kubenswrapper[4676]: I1204 16:09:01.372483 4676 generic.go:334] "Generic (PLEG): container finished" podID="5442582f-4481-43c8-9fb5-701b47cb5674" containerID="eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5" exitCode=0 Dec 04 16:09:01 crc kubenswrapper[4676]: I1204 16:09:01.372533 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerDied","Data":"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5"} Dec 04 16:09:02 crc kubenswrapper[4676]: I1204 16:09:02.388107 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerStarted","Data":"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015"} Dec 04 16:09:03 crc kubenswrapper[4676]: I1204 16:09:03.420423 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6msp5" podStartSLOduration=2.960471351 podStartE2EDuration="5.420382808s" podCreationTimestamp="2025-12-04 16:08:58 +0000 UTC" firstStartedPulling="2025-12-04 16:08:59.347503172 +0000 UTC m=+2946.782173029" lastFinishedPulling="2025-12-04 16:09:01.807414629 +0000 UTC m=+2949.242084486" observedRunningTime="2025-12-04 16:09:03.415300422 +0000 UTC m=+2950.849970279" watchObservedRunningTime="2025-12-04 16:09:03.420382808 +0000 UTC m=+2950.855052665" Dec 04 16:09:08 crc kubenswrapper[4676]: I1204 16:09:08.431618 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:08 crc kubenswrapper[4676]: I1204 16:09:08.432180 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:08 crc kubenswrapper[4676]: I1204 16:09:08.478021 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:08 crc kubenswrapper[4676]: I1204 16:09:08.751118 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:08 crc kubenswrapper[4676]: I1204 16:09:08.809238 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:09:10 crc kubenswrapper[4676]: I1204 16:09:10.385060 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:09:10 crc kubenswrapper[4676]: E1204 16:09:10.385625 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:09:10 crc kubenswrapper[4676]: I1204 16:09:10.718456 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6msp5" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="registry-server" containerID="cri-o://eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015" gracePeriod=2 Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.204703 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.341897 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities\") pod \"5442582f-4481-43c8-9fb5-701b47cb5674\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.341987 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content\") pod \"5442582f-4481-43c8-9fb5-701b47cb5674\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.342142 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpv87\" (UniqueName: \"kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87\") pod \"5442582f-4481-43c8-9fb5-701b47cb5674\" (UID: \"5442582f-4481-43c8-9fb5-701b47cb5674\") " Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.343536 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities" (OuterVolumeSpecName: "utilities") pod "5442582f-4481-43c8-9fb5-701b47cb5674" (UID: "5442582f-4481-43c8-9fb5-701b47cb5674"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.349280 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87" (OuterVolumeSpecName: "kube-api-access-lpv87") pod "5442582f-4481-43c8-9fb5-701b47cb5674" (UID: "5442582f-4481-43c8-9fb5-701b47cb5674"). InnerVolumeSpecName "kube-api-access-lpv87". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.411103 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5442582f-4481-43c8-9fb5-701b47cb5674" (UID: "5442582f-4481-43c8-9fb5-701b47cb5674"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.507508 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpv87\" (UniqueName: \"kubernetes.io/projected/5442582f-4481-43c8-9fb5-701b47cb5674-kube-api-access-lpv87\") on node \"crc\" DevicePath \"\"" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.507533 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.507545 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5442582f-4481-43c8-9fb5-701b47cb5674-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.730549 4676 generic.go:334] "Generic (PLEG): container finished" podID="5442582f-4481-43c8-9fb5-701b47cb5674" containerID="eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015" exitCode=0 Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.730900 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerDied","Data":"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015"} Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.730966 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6msp5" event={"ID":"5442582f-4481-43c8-9fb5-701b47cb5674","Type":"ContainerDied","Data":"77e8c79e4b0d1bf66f5af83620f566a8160f4854c0d2a97306fd5b36f684369d"} Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.730988 4676 scope.go:117] "RemoveContainer" containerID="eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.731190 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6msp5" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.768973 4676 scope.go:117] "RemoveContainer" containerID="eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.775630 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.785446 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6msp5"] Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.791131 4676 scope.go:117] "RemoveContainer" containerID="a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.840389 4676 scope.go:117] "RemoveContainer" containerID="eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015" Dec 04 16:09:11 crc kubenswrapper[4676]: E1204 16:09:11.840943 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015\": container with ID starting with eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015 not found: ID does not exist" containerID="eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.840991 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015"} err="failed to get container status \"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015\": rpc error: code = NotFound desc = could not find container \"eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015\": container with ID starting with eb0fed7b2f4e82148901d535625715be0446a379c33394c3043288c91c1ea015 not found: ID does not exist" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.841018 4676 scope.go:117] "RemoveContainer" containerID="eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5" Dec 04 16:09:11 crc kubenswrapper[4676]: E1204 16:09:11.841511 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5\": container with ID starting with eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5 not found: ID does not exist" containerID="eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.841558 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5"} err="failed to get container status \"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5\": rpc error: code = NotFound desc = could not find container \"eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5\": container with ID starting with eb43d2a744fff24c88e1d21289869c819a8393b5717128c3f3dbf39c51b073a5 not found: ID does not exist" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.841587 4676 scope.go:117] "RemoveContainer" containerID="a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2" Dec 04 16:09:11 crc kubenswrapper[4676]: E1204 16:09:11.841923 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2\": container with ID starting with a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2 not found: ID does not exist" containerID="a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2" Dec 04 16:09:11 crc kubenswrapper[4676]: I1204 16:09:11.841972 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2"} err="failed to get container status \"a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2\": rpc error: code = NotFound desc = could not find container \"a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2\": container with ID starting with a87911894bc774d95676c6ef93900b51b727f0efe8449d69235c7c019cd727b2 not found: ID does not exist" Dec 04 16:09:13 crc kubenswrapper[4676]: I1204 16:09:13.397211 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" path="/var/lib/kubelet/pods/5442582f-4481-43c8-9fb5-701b47cb5674/volumes" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.802717 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 04 16:09:22 crc kubenswrapper[4676]: E1204 16:09:22.804318 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="extract-content" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.804338 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="extract-content" Dec 04 16:09:22 crc kubenswrapper[4676]: E1204 16:09:22.804354 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="extract-utilities" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.804360 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="extract-utilities" Dec 04 16:09:22 crc kubenswrapper[4676]: E1204 16:09:22.804378 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="registry-server" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.804384 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="registry-server" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.804609 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5442582f-4481-43c8-9fb5-701b47cb5674" containerName="registry-server" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.805730 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.808164 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.817375 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845109 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-scripts\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845192 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-lib-modules\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845216 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845251 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845304 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-dev\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845337 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845399 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845459 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845490 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845794 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845867 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845955 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srdbb\" (UniqueName: \"kubernetes.io/projected/4824604f-7b99-455c-be80-b8410dc47264-kube-api-access-srdbb\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.845997 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.846023 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-run\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.846043 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-sys\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.902321 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.904281 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.906615 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.932659 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947728 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-lib-modules\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947778 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947819 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947846 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947897 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw57n\" (UniqueName: \"kubernetes.io/projected/9572f37c-801d-4ea4-acfe-4ad3be15946a-kube-api-access-jw57n\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947949 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.947978 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-dev\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948006 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948062 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948115 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948142 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948173 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948197 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948223 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948240 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948260 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948277 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948310 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948354 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948389 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948431 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948478 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srdbb\" (UniqueName: \"kubernetes.io/projected/4824604f-7b99-455c-be80-b8410dc47264-kube-api-access-srdbb\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948563 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948593 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-run\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948617 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-sys\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948666 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948688 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948705 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948739 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.948774 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-scripts\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.949658 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-lib-modules\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.949793 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.953955 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.954032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-run\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.954057 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-sys\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.954111 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.955108 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.955445 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.955630 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-dev\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.955817 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4824604f-7b99-455c-be80-b8410dc47264-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.957583 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.961494 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.962751 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.962951 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.963790 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.964523 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4824604f-7b99-455c-be80-b8410dc47264-scripts\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.967436 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.981180 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srdbb\" (UniqueName: \"kubernetes.io/projected/4824604f-7b99-455c-be80-b8410dc47264-kube-api-access-srdbb\") pod \"cinder-backup-0\" (UID: \"4824604f-7b99-455c-be80-b8410dc47264\") " pod="openstack/cinder-backup-0" Dec 04 16:09:22 crc kubenswrapper[4676]: I1204 16:09:22.989667 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049466 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049774 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-run\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw57n\" (UniqueName: \"kubernetes.io/projected/9572f37c-801d-4ea4-acfe-4ad3be15946a-kube-api-access-jw57n\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049827 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049846 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049863 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049885 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049899 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049940 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf72q\" (UniqueName: \"kubernetes.io/projected/2edf87ae-1216-4015-9a84-9db0c05f045e-kube-api-access-pf72q\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049971 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050009 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050033 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050054 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050074 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050089 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050105 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " 
pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050128 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050150 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050172 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050200 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050273 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050300 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050330 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050353 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050374 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050393 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050429 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050448 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050472 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050502 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.049628 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.050948 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051007 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051323 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051417 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc 
kubenswrapper[4676]: I1204 16:09:23.051390 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051550 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051349 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051697 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.051739 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9572f37c-801d-4ea4-acfe-4ad3be15946a-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.055096 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.055159 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.055343 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.056331 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9572f37c-801d-4ea4-acfe-4ad3be15946a-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.068574 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw57n\" (UniqueName: \"kubernetes.io/projected/9572f37c-801d-4ea4-acfe-4ad3be15946a-kube-api-access-jw57n\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"9572f37c-801d-4ea4-acfe-4ad3be15946a\") " pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.136397 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153452 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153518 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153563 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153618 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153643 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153683 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153732 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-run\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153781 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153808 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 
16:09:23.153833 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153854 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153884 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf72q\" (UniqueName: \"kubernetes.io/projected/2edf87ae-1216-4015-9a84-9db0c05f045e-kube-api-access-pf72q\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.153970 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154025 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154080 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154298 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154355 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154396 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154580 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: 
\"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154638 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154647 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154677 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-run\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154710 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154731 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.154938 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2edf87ae-1216-4015-9a84-9db0c05f045e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.159415 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.160137 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.160854 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.167224 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2edf87ae-1216-4015-9a84-9db0c05f045e-scripts\") pod \"cinder-volume-nfs-0\" (UID: 
\"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.175470 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf72q\" (UniqueName: \"kubernetes.io/projected/2edf87ae-1216-4015-9a84-9db0c05f045e-kube-api-access-pf72q\") pod \"cinder-volume-nfs-0\" (UID: \"2edf87ae-1216-4015-9a84-9db0c05f045e\") " pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.223279 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.229363 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.394803 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:09:23 crc kubenswrapper[4676]: E1204 16:09:23.396349 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:09:23 crc kubenswrapper[4676]: I1204 16:09:23.969775 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 04 16:09:23 crc kubenswrapper[4676]: W1204 16:09:23.975435 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4824604f_7b99_455c_be80_b8410dc47264.slice/crio-66dccfa9fb2e7b99616371a66fd59ad9a496cfdab17f01c79f8f61a18483dc75 WatchSource:0}: Error finding container 66dccfa9fb2e7b99616371a66fd59ad9a496cfdab17f01c79f8f61a18483dc75: Status 404 returned error can't find the container with id 66dccfa9fb2e7b99616371a66fd59ad9a496cfdab17f01c79f8f61a18483dc75 Dec 04 16:09:24 crc kubenswrapper[4676]: I1204 16:09:24.067523 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Dec 04 16:09:24 crc kubenswrapper[4676]: W1204 16:09:24.175098 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2edf87ae_1216_4015_9a84_9db0c05f045e.slice/crio-6403bb6ae626fddd36ebad16933a850e607b9d154bf6e137b5618635e109e794 WatchSource:0}: Error finding container 6403bb6ae626fddd36ebad16933a850e607b9d154bf6e137b5618635e109e794: Status 404 returned error can't find the container with id 6403bb6ae626fddd36ebad16933a850e607b9d154bf6e137b5618635e109e794 Dec 04 16:09:24 crc kubenswrapper[4676]: I1204 16:09:24.180211 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Dec 04 16:09:24 crc kubenswrapper[4676]: I1204 16:09:24.875234 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"9572f37c-801d-4ea4-acfe-4ad3be15946a","Type":"ContainerStarted","Data":"1a1377455cf633eba00dd08c6f3ede0e8b247b2b6f32fea04734f36650f1582a"} Dec 04 16:09:24 crc kubenswrapper[4676]: I1204 16:09:24.876693 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" 
event={"ID":"2edf87ae-1216-4015-9a84-9db0c05f045e","Type":"ContainerStarted","Data":"6403bb6ae626fddd36ebad16933a850e607b9d154bf6e137b5618635e109e794"} Dec 04 16:09:24 crc kubenswrapper[4676]: I1204 16:09:24.878813 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4824604f-7b99-455c-be80-b8410dc47264","Type":"ContainerStarted","Data":"66dccfa9fb2e7b99616371a66fd59ad9a496cfdab17f01c79f8f61a18483dc75"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.890924 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"9572f37c-801d-4ea4-acfe-4ad3be15946a","Type":"ContainerStarted","Data":"dac67629bd60d5702079aa9a5f44ae596caf77efcd6980942607ea9005fe3661"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.891848 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"9572f37c-801d-4ea4-acfe-4ad3be15946a","Type":"ContainerStarted","Data":"8f79598402616fd40fb4660c783a3f71b2cecb00ddd0ff40f504f0a7870eebd1"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.893365 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"2edf87ae-1216-4015-9a84-9db0c05f045e","Type":"ContainerStarted","Data":"28b56747332a523a2097a4b3a8414b448c07a4ceff92101211bca5353aa88ce8"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.893392 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"2edf87ae-1216-4015-9a84-9db0c05f045e","Type":"ContainerStarted","Data":"b0f55a2d3585172d1980f28eafee35e146b0ddb9bc627a778c9f75a90b5c214a"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.897211 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4824604f-7b99-455c-be80-b8410dc47264","Type":"ContainerStarted","Data":"1e720c23b6b5fcd77109171f861fb4df2fa7de21cd8d9589db224d4f4b0b9783"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.897262 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4824604f-7b99-455c-be80-b8410dc47264","Type":"ContainerStarted","Data":"f026b0239251e36b5b2ae5f470ffe33e7c21943df320fa874952043bad556e69"} Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.974768 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=3.200480381 podStartE2EDuration="3.974733138s" podCreationTimestamp="2025-12-04 16:09:22 +0000 UTC" firstStartedPulling="2025-12-04 16:09:24.060185557 +0000 UTC m=+2971.494855414" lastFinishedPulling="2025-12-04 16:09:24.834438314 +0000 UTC m=+2972.269108171" observedRunningTime="2025-12-04 16:09:25.93548792 +0000 UTC m=+2973.370157787" watchObservedRunningTime="2025-12-04 16:09:25.974733138 +0000 UTC m=+2973.409402995" Dec 04 16:09:25 crc kubenswrapper[4676]: I1204 16:09:25.975803 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.201234162 podStartE2EDuration="3.975795039s" podCreationTimestamp="2025-12-04 16:09:22 +0000 UTC" firstStartedPulling="2025-12-04 16:09:23.978047244 +0000 UTC m=+2971.412717101" lastFinishedPulling="2025-12-04 16:09:24.752608121 +0000 UTC m=+2972.187277978" observedRunningTime="2025-12-04 16:09:25.973278516 +0000 UTC m=+2973.407948393" watchObservedRunningTime="2025-12-04 16:09:25.975795039 +0000 UTC m=+2973.410464896" Dec 04 16:09:26 crc 
kubenswrapper[4676]: I1204 16:09:26.001704 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.3450815990000002 podStartE2EDuration="4.001687653s" podCreationTimestamp="2025-12-04 16:09:22 +0000 UTC" firstStartedPulling="2025-12-04 16:09:24.176868322 +0000 UTC m=+2971.611538169" lastFinishedPulling="2025-12-04 16:09:24.833474356 +0000 UTC m=+2972.268144223" observedRunningTime="2025-12-04 16:09:25.994846337 +0000 UTC m=+2973.429516194" watchObservedRunningTime="2025-12-04 16:09:26.001687653 +0000 UTC m=+2973.436357500" Dec 04 16:09:28 crc kubenswrapper[4676]: I1204 16:09:28.136473 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 04 16:09:28 crc kubenswrapper[4676]: I1204 16:09:28.224735 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:28 crc kubenswrapper[4676]: I1204 16:09:28.230773 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:33 crc kubenswrapper[4676]: I1204 16:09:33.454154 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Dec 04 16:09:33 crc kubenswrapper[4676]: I1204 16:09:33.526126 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 04 16:09:33 crc kubenswrapper[4676]: I1204 16:09:33.544156 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Dec 04 16:09:34 crc kubenswrapper[4676]: I1204 16:09:34.384223 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:09:34 crc kubenswrapper[4676]: E1204 16:09:34.384616 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:09:46 crc kubenswrapper[4676]: I1204 16:09:46.384222 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:09:46 crc kubenswrapper[4676]: E1204 16:09:46.385130 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:00 crc kubenswrapper[4676]: I1204 16:10:00.385061 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:10:00 crc kubenswrapper[4676]: E1204 16:10:00.385776 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:13 crc kubenswrapper[4676]: I1204 16:10:13.390730 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:10:13 crc kubenswrapper[4676]: E1204 16:10:13.391517 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:25 crc kubenswrapper[4676]: I1204 16:10:25.590058 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:25 crc kubenswrapper[4676]: I1204 16:10:25.590727 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="prometheus" containerID="cri-o://991a9a33e94f251f9ea53fb45351db6e413bd376ad4e2f9824b66c19f1bf3920" gracePeriod=600 Dec 04 16:10:25 crc kubenswrapper[4676]: I1204 16:10:25.590852 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="config-reloader" containerID="cri-o://23cf260c28249d11f44440cb43d81694bb3dfa95a535fac9169e3ef103394bce" gracePeriod=600 Dec 04 16:10:25 crc kubenswrapper[4676]: I1204 16:10:25.592593 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="thanos-sidecar" containerID="cri-o://a91e1a0554e9ce02fae7e7976179b0a8d76dbbdd0d92ebe87923262e15de4c5a" gracePeriod=600 Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.384984 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:10:26 crc kubenswrapper[4676]: E1204 16:10:26.385968 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567298 4676 generic.go:334] "Generic (PLEG): container finished" podID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerID="a91e1a0554e9ce02fae7e7976179b0a8d76dbbdd0d92ebe87923262e15de4c5a" exitCode=0 Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567340 4676 generic.go:334] "Generic (PLEG): container finished" podID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerID="23cf260c28249d11f44440cb43d81694bb3dfa95a535fac9169e3ef103394bce" exitCode=0 Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567349 4676 generic.go:334] "Generic (PLEG): container finished" podID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerID="991a9a33e94f251f9ea53fb45351db6e413bd376ad4e2f9824b66c19f1bf3920" exitCode=0 Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567369 4676 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerDied","Data":"a91e1a0554e9ce02fae7e7976179b0a8d76dbbdd0d92ebe87923262e15de4c5a"} Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567433 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerDied","Data":"23cf260c28249d11f44440cb43d81694bb3dfa95a535fac9169e3ef103394bce"} Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.567452 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerDied","Data":"991a9a33e94f251f9ea53fb45351db6e413bd376ad4e2f9824b66c19f1bf3920"} Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.732616 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867591 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867657 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867709 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867872 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867934 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867956 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.867992 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.868039 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.868057 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scn6r\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.868099 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.868143 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out\") pod \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\" (UID: \"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966\") " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.868811 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.874197 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.874207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.874721 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config" (OuterVolumeSpecName: "config") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.876394 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out" (OuterVolumeSpecName: "config-out") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.877276 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.877874 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.880258 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r" (OuterVolumeSpecName: "kube-api-access-scn6r") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "kube-api-access-scn6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.881048 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.890155 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.967429 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config" (OuterVolumeSpecName: "web-config") pod "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" (UID: "0affe6f6-46dd-4d5c-8ec7-2c1ad220a966"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972645 4676 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972672 4676 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972684 4676 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972694 4676 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972705 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scn6r\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-kube-api-access-scn6r\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972717 4676 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972724 4676 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config-out\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972766 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") on node \"crc\" " Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972792 4676 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972806 4676 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-config\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:26 crc kubenswrapper[4676]: I1204 16:10:26.972818 4676 reconciler_common.go:293] "Volume detached for volume 
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.008115 4676 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.008287 4676 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515") on node "crc" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.076656 4676 reconciler_common.go:293] "Volume detached for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") on node \"crc\" DevicePath \"\"" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.592776 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0affe6f6-46dd-4d5c-8ec7-2c1ad220a966","Type":"ContainerDied","Data":"3912ccd57b14bef539a7dcb43fff8badc45eb6dbb5e7b86d3a062206bf974983"} Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.592859 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.593086 4676 scope.go:117] "RemoveContainer" containerID="a91e1a0554e9ce02fae7e7976179b0a8d76dbbdd0d92ebe87923262e15de4c5a" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.637137 4676 scope.go:117] "RemoveContainer" containerID="23cf260c28249d11f44440cb43d81694bb3dfa95a535fac9169e3ef103394bce" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.665795 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.685032 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.699429 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.704186 4676 scope.go:117] "RemoveContainer" containerID="991a9a33e94f251f9ea53fb45351db6e413bd376ad4e2f9824b66c19f1bf3920" Dec 04 16:10:27 crc kubenswrapper[4676]: E1204 16:10:27.718617 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="config-reloader" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.718669 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="config-reloader" Dec 04 16:10:27 crc kubenswrapper[4676]: E1204 16:10:27.718704 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="init-config-reloader" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.718714 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="init-config-reloader" Dec 04 16:10:27 crc kubenswrapper[4676]: E1204 16:10:27.718743 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="thanos-sidecar" Dec 04 16:10:27 crc 
kubenswrapper[4676]: I1204 16:10:27.718752 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="thanos-sidecar" Dec 04 16:10:27 crc kubenswrapper[4676]: E1204 16:10:27.718779 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="prometheus" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.718787 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="prometheus" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.719097 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="thanos-sidecar" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.719140 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="prometheus" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.719155 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="config-reloader" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.722024 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.726719 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.729950 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.730084 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-dsbwb" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.730162 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.730261 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.730303 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.740620 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.772548 4676 scope.go:117] "RemoveContainer" containerID="3208384ccdbd564f3e354d6c6164a6970015b7dc9c6d09643c22c9f914c60fe8" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.904866 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6970c56-0104-45cf-a58e-91be763b6054-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.904927 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.904966 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.904994 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905217 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66c5j\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-kube-api-access-66c5j\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905302 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-config\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905425 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905469 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905564 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6970c56-0104-45cf-a58e-91be763b6054-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905594 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " 
pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:27 crc kubenswrapper[4676]: I1204 16:10:27.905747 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.007967 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6970c56-0104-45cf-a58e-91be763b6054-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008020 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008090 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008218 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6970c56-0104-45cf-a58e-91be763b6054-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008244 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008278 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008315 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008353 4676 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-config\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008374 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66c5j\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-kube-api-access-66c5j\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008435 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.008468 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.010195 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6970c56-0104-45cf-a58e-91be763b6054-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.012891 4676 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.012972 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/20a8147025daa03f462937d002ea44fbf472037636c1db1460079ca29c39445e/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.013061 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-config\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.014358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.014610 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.015097 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.015336 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6970c56-0104-45cf-a58e-91be763b6054-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.015461 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.018796 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.034016 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6970c56-0104-45cf-a58e-91be763b6054-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.041380 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66c5j\" (UniqueName: \"kubernetes.io/projected/f6970c56-0104-45cf-a58e-91be763b6054-kube-api-access-66c5j\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.084024 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bdc7c64c-8182-4f81-bf2b-9e110f1dd515\") pod \"prometheus-metric-storage-0\" (UID: \"f6970c56-0104-45cf-a58e-91be763b6054\") " pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.362693 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:28 crc kubenswrapper[4676]: I1204 16:10:28.837137 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 04 16:10:29 crc kubenswrapper[4676]: I1204 16:10:29.396054 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" path="/var/lib/kubelet/pods/0affe6f6-46dd-4d5c-8ec7-2c1ad220a966/volumes" Dec 04 16:10:29 crc kubenswrapper[4676]: I1204 16:10:29.490495 4676 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0affe6f6-46dd-4d5c-8ec7-2c1ad220a966" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.131:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 04 16:10:29 crc kubenswrapper[4676]: I1204 16:10:29.613972 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerStarted","Data":"4a902ef1a08dd1c26e879f53140abdee129f28d120c385e8e3ccf17a835df546"} Dec 04 16:10:32 crc kubenswrapper[4676]: I1204 16:10:32.645570 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerStarted","Data":"c5268373c351bbf60a34061644ab86a840ebe38cad7477121e112c914dae5ca6"} Dec 04 16:10:39 crc kubenswrapper[4676]: I1204 16:10:39.406567 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:10:39 crc kubenswrapper[4676]: E1204 16:10:39.416458 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:40 crc kubenswrapper[4676]: I1204 16:10:40.806996 4676 generic.go:334] "Generic (PLEG): container finished" 
podID="f6970c56-0104-45cf-a58e-91be763b6054" containerID="c5268373c351bbf60a34061644ab86a840ebe38cad7477121e112c914dae5ca6" exitCode=0 Dec 04 16:10:40 crc kubenswrapper[4676]: I1204 16:10:40.807247 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerDied","Data":"c5268373c351bbf60a34061644ab86a840ebe38cad7477121e112c914dae5ca6"} Dec 04 16:10:41 crc kubenswrapper[4676]: I1204 16:10:41.818811 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerStarted","Data":"64626965a35af2b55f12e23e7197ac8a98f09fdd4285568ba4b65a4e1fb39b93"} Dec 04 16:10:44 crc kubenswrapper[4676]: I1204 16:10:44.958196 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerStarted","Data":"0f72aedb5d3b4e39d4442cb34676439e880ba1abf4c7f1a968ad84fae9bae594"} Dec 04 16:10:45 crc kubenswrapper[4676]: I1204 16:10:45.973032 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"f6970c56-0104-45cf-a58e-91be763b6054","Type":"ContainerStarted","Data":"8e86282bed3e8db02563c3cfae761eaecb34e56a348088876d7510ae409b29f9"} Dec 04 16:10:46 crc kubenswrapper[4676]: I1204 16:10:46.012171 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.01215192 podStartE2EDuration="19.01215192s" podCreationTimestamp="2025-12-04 16:10:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 16:10:46.011584284 +0000 UTC m=+3053.446254151" watchObservedRunningTime="2025-12-04 16:10:46.01215192 +0000 UTC m=+3053.446821777" Dec 04 16:10:48 crc kubenswrapper[4676]: I1204 16:10:48.363313 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:54 crc kubenswrapper[4676]: I1204 16:10:54.384664 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:10:54 crc kubenswrapper[4676]: E1204 16:10:54.386349 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:10:58 crc kubenswrapper[4676]: I1204 16:10:58.363544 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:58 crc kubenswrapper[4676]: I1204 16:10:58.370248 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 04 16:10:59 crc kubenswrapper[4676]: I1204 16:10:59.094628 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 04 16:11:02 crc kubenswrapper[4676]: I1204 16:11:02.270160 4676 scope.go:117] "RemoveContainer" containerID="29388a76b5c189ba03a6cc1a551442a614545805169e69b03404cf8311df29aa" Dec 04 16:11:02 crc 
kubenswrapper[4676]: I1204 16:11:02.294311 4676 scope.go:117] "RemoveContainer" containerID="135bb0f07305ad19e631f25c9ac0c9993a62bd020ac5ad0c609afb13f448a280" Dec 04 16:11:02 crc kubenswrapper[4676]: I1204 16:11:02.325149 4676 scope.go:117] "RemoveContainer" containerID="c3f0f94b08f87ed5e3e56ec5ad1f70e6f13c4506f39c5e5068081cfbff7902cb" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.133081 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.135609 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.141374 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.141973 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.142174 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4zwdj" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.144085 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.150089 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.178372 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.178551 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.179034 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281302 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281395 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 
16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281476 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281501 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281533 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281561 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281590 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281659 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k85c\" (UniqueName: \"kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.281715 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.283032 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.283375 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.291938 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.384440 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k85c\" (UniqueName: \"kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.384636 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.384990 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:11:05 crc kubenswrapper[4676]: E1204 16:11:05.385310 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.385528 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.385545 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.386081 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.386237 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.386372 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key\") pod \"tempest-tests-tempest\" 
(UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.386702 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.387238 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.390062 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.395632 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.410248 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k85c\" (UniqueName: \"kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.427138 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") " pod="openstack/tempest-tests-tempest" Dec 04 16:11:05 crc kubenswrapper[4676]: I1204 16:11:05.459876 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 16:11:06 crc kubenswrapper[4676]: I1204 16:11:06.032124 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 04 16:11:06 crc kubenswrapper[4676]: I1204 16:11:06.162384 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1728d401-fbd4-470d-8084-deaa0ca6c1b5","Type":"ContainerStarted","Data":"6a3fec7f331a1db3dcb21161964bcf4dd921596dc4781869fc43f6490acf3d00"} Dec 04 16:11:19 crc kubenswrapper[4676]: I1204 16:11:19.292345 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1728d401-fbd4-470d-8084-deaa0ca6c1b5","Type":"ContainerStarted","Data":"44d20814f0884951383435f51f06966d06d34ad548dc9f3c9cc5a8921d0de952"} Dec 04 16:11:19 crc kubenswrapper[4676]: I1204 16:11:19.313441 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=2.960522003 podStartE2EDuration="15.313421291s" podCreationTimestamp="2025-12-04 16:11:04 +0000 UTC" firstStartedPulling="2025-12-04 16:11:06.036621422 +0000 UTC m=+3073.471291279" lastFinishedPulling="2025-12-04 16:11:18.38952071 +0000 UTC m=+3085.824190567" observedRunningTime="2025-12-04 16:11:19.310610931 +0000 UTC m=+3086.745280788" watchObservedRunningTime="2025-12-04 16:11:19.313421291 +0000 UTC m=+3086.748091168" Dec 04 16:11:20 crc kubenswrapper[4676]: I1204 16:11:20.384365 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:11:20 crc kubenswrapper[4676]: E1204 16:11:20.385743 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:11:31 crc kubenswrapper[4676]: I1204 16:11:31.387405 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:11:31 crc kubenswrapper[4676]: E1204 16:11:31.388469 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:11:42 crc kubenswrapper[4676]: I1204 16:11:42.385255 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:11:42 crc kubenswrapper[4676]: E1204 16:11:42.386441 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:11:53 crc kubenswrapper[4676]: I1204 16:11:53.391407 4676 scope.go:117] 
"RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:11:53 crc kubenswrapper[4676]: E1204 16:11:53.392317 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:12:07 crc kubenswrapper[4676]: I1204 16:12:07.384286 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:12:07 crc kubenswrapper[4676]: E1204 16:12:07.385068 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:12:21 crc kubenswrapper[4676]: I1204 16:12:21.385375 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:12:21 crc kubenswrapper[4676]: E1204 16:12:21.387361 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:12:36 crc kubenswrapper[4676]: I1204 16:12:36.385326 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:12:36 crc kubenswrapper[4676]: E1204 16:12:36.386127 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:12:47 crc kubenswrapper[4676]: I1204 16:12:47.384713 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:12:47 crc kubenswrapper[4676]: E1204 16:12:47.385664 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:13:02 crc kubenswrapper[4676]: I1204 16:13:02.384586 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:13:02 crc kubenswrapper[4676]: E1204 16:13:02.385549 4676 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:13:14 crc kubenswrapper[4676]: I1204 16:13:14.448571 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:13:14 crc kubenswrapper[4676]: E1204 16:13:14.449969 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:13:29 crc kubenswrapper[4676]: I1204 16:13:29.385012 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:13:30 crc kubenswrapper[4676]: I1204 16:13:30.437160 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42"} Dec 04 16:14:02 crc kubenswrapper[4676]: I1204 16:14:02.508763 4676 scope.go:117] "RemoveContainer" containerID="8b91020abe4ae4ea0ca2ad41f69d737ef5b18e7b1d73edc711174159bff13424" Dec 04 16:14:02 crc kubenswrapper[4676]: I1204 16:14:02.539585 4676 scope.go:117] "RemoveContainer" containerID="2afc409358cfee0898c8ac2bd47a2104f82d9d6db8c4c3dff03758b954ca24eb" Dec 04 16:14:02 crc kubenswrapper[4676]: I1204 16:14:02.594795 4676 scope.go:117] "RemoveContainer" containerID="4d4f6e9c6a466186a989101b3403a40d4afcbdd3e84effc36af36f46f6002c8d" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.568690 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.571634 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.601683 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.681717 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.682140 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.682214 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvmbs\" (UniqueName: \"kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.784359 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.784724 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.784832 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvmbs\" (UniqueName: \"kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.784894 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.785170 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.814799 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hvmbs\" (UniqueName: \"kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs\") pod \"redhat-operators-9r4w8\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:27 crc kubenswrapper[4676]: I1204 16:14:27.895394 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:28 crc kubenswrapper[4676]: I1204 16:14:28.557390 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:29 crc kubenswrapper[4676]: I1204 16:14:29.502118 4676 generic.go:334] "Generic (PLEG): container finished" podID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerID="8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f" exitCode=0 Dec 04 16:14:29 crc kubenswrapper[4676]: I1204 16:14:29.502187 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerDied","Data":"8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f"} Dec 04 16:14:29 crc kubenswrapper[4676]: I1204 16:14:29.502607 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerStarted","Data":"1a19de2ebfb2095aac63f0dc7e5f8dfe8b048e54598510a89271758d44fe9b0a"} Dec 04 16:14:29 crc kubenswrapper[4676]: I1204 16:14:29.504735 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 16:14:30 crc kubenswrapper[4676]: I1204 16:14:30.514017 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerStarted","Data":"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb"} Dec 04 16:14:34 crc kubenswrapper[4676]: I1204 16:14:34.550621 4676 generic.go:334] "Generic (PLEG): container finished" podID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerID="3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb" exitCode=0 Dec 04 16:14:34 crc kubenswrapper[4676]: I1204 16:14:34.550717 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerDied","Data":"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb"} Dec 04 16:14:35 crc kubenswrapper[4676]: I1204 16:14:35.563899 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerStarted","Data":"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d"} Dec 04 16:14:35 crc kubenswrapper[4676]: I1204 16:14:35.585779 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9r4w8" podStartSLOduration=3.12331908 podStartE2EDuration="8.585759189s" podCreationTimestamp="2025-12-04 16:14:27 +0000 UTC" firstStartedPulling="2025-12-04 16:14:29.504377869 +0000 UTC m=+3276.939047726" lastFinishedPulling="2025-12-04 16:14:34.966817978 +0000 UTC m=+3282.401487835" observedRunningTime="2025-12-04 16:14:35.580742474 +0000 UTC m=+3283.015412341" watchObservedRunningTime="2025-12-04 16:14:35.585759189 +0000 UTC m=+3283.020429046" Dec 04 16:14:37 crc 
kubenswrapper[4676]: I1204 16:14:37.895653 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:37 crc kubenswrapper[4676]: I1204 16:14:37.895992 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:38 crc kubenswrapper[4676]: I1204 16:14:38.944450 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9r4w8" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="registry-server" probeResult="failure" output=< Dec 04 16:14:38 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 16:14:38 crc kubenswrapper[4676]: > Dec 04 16:14:47 crc kubenswrapper[4676]: I1204 16:14:47.945928 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:48 crc kubenswrapper[4676]: I1204 16:14:48.002400 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:48 crc kubenswrapper[4676]: I1204 16:14:48.191647 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:49 crc kubenswrapper[4676]: I1204 16:14:49.691457 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9r4w8" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="registry-server" containerID="cri-o://8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d" gracePeriod=2 Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.457663 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.615789 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvmbs\" (UniqueName: \"kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs\") pod \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.616089 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content\") pod \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.616312 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities\") pod \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\" (UID: \"2e08f093-5c4f-42cf-9f70-3d22bef4e45b\") " Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.617714 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities" (OuterVolumeSpecName: "utilities") pod "2e08f093-5c4f-42cf-9f70-3d22bef4e45b" (UID: "2e08f093-5c4f-42cf-9f70-3d22bef4e45b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.624348 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs" (OuterVolumeSpecName: "kube-api-access-hvmbs") pod "2e08f093-5c4f-42cf-9f70-3d22bef4e45b" (UID: "2e08f093-5c4f-42cf-9f70-3d22bef4e45b"). InnerVolumeSpecName "kube-api-access-hvmbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.708680 4676 generic.go:334] "Generic (PLEG): container finished" podID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerID="8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d" exitCode=0 Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.708731 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerDied","Data":"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d"} Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.708734 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9r4w8" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.708764 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9r4w8" event={"ID":"2e08f093-5c4f-42cf-9f70-3d22bef4e45b","Type":"ContainerDied","Data":"1a19de2ebfb2095aac63f0dc7e5f8dfe8b048e54598510a89271758d44fe9b0a"} Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.708813 4676 scope.go:117] "RemoveContainer" containerID="8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.726365 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.726399 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvmbs\" (UniqueName: \"kubernetes.io/projected/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-kube-api-access-hvmbs\") on node \"crc\" DevicePath \"\"" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.726515 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e08f093-5c4f-42cf-9f70-3d22bef4e45b" (UID: "2e08f093-5c4f-42cf-9f70-3d22bef4e45b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.739152 4676 scope.go:117] "RemoveContainer" containerID="3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.768251 4676 scope.go:117] "RemoveContainer" containerID="8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.823614 4676 scope.go:117] "RemoveContainer" containerID="8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d" Dec 04 16:14:50 crc kubenswrapper[4676]: E1204 16:14:50.824586 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d\": container with ID starting with 8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d not found: ID does not exist" containerID="8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.824622 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d"} err="failed to get container status \"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d\": rpc error: code = NotFound desc = could not find container \"8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d\": container with ID starting with 8ab2c065ac27f7611fa38240d56c84c7c8d72ee193c1d0f5e385b7e71426626d not found: ID does not exist" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.824644 4676 scope.go:117] "RemoveContainer" containerID="3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb" Dec 04 16:14:50 crc kubenswrapper[4676]: E1204 16:14:50.824961 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb\": container with ID starting with 3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb not found: ID does not exist" containerID="3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.825000 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb"} err="failed to get container status \"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb\": rpc error: code = NotFound desc = could not find container \"3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb\": container with ID starting with 3b7c1932cbbe422a27123473f403adfa886f13edaa542e98f984ad0c0919eabb not found: ID does not exist" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.825035 4676 scope.go:117] "RemoveContainer" containerID="8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f" Dec 04 16:14:50 crc kubenswrapper[4676]: E1204 16:14:50.825338 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f\": container with ID starting with 8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f not found: ID does not exist" containerID="8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f" Dec 04 16:14:50 crc 
kubenswrapper[4676]: I1204 16:14:50.825361 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f"} err="failed to get container status \"8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f\": rpc error: code = NotFound desc = could not find container \"8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f\": container with ID starting with 8015e981c58cc275ec2ba265a3d8644a867cf535e9a707151367f4b49008df6f not found: ID does not exist" Dec 04 16:14:50 crc kubenswrapper[4676]: I1204 16:14:50.827784 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e08f093-5c4f-42cf-9f70-3d22bef4e45b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:14:51 crc kubenswrapper[4676]: I1204 16:14:51.084088 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:51 crc kubenswrapper[4676]: I1204 16:14:51.093414 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9r4w8"] Dec 04 16:14:51 crc kubenswrapper[4676]: I1204 16:14:51.398801 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" path="/var/lib/kubelet/pods/2e08f093-5c4f-42cf-9f70-3d22bef4e45b/volumes" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.157290 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz"] Dec 04 16:15:00 crc kubenswrapper[4676]: E1204 16:15:00.158227 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="registry-server" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.158249 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="registry-server" Dec 04 16:15:00 crc kubenswrapper[4676]: E1204 16:15:00.158265 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="extract-utilities" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.158271 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="extract-utilities" Dec 04 16:15:00 crc kubenswrapper[4676]: E1204 16:15:00.158292 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="extract-content" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.158298 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="extract-content" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.158516 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e08f093-5c4f-42cf-9f70-3d22bef4e45b" containerName="registry-server" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.159336 4676 util.go:30] "No sandbox for pod can be found. 
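The RemoveContainer / NotFound pairs above are the kubelet retrying deletion of containers the runtime has already pruned; the error is effectively benign. A minimal sketch of treating CRI NotFound as already-removed, assuming the standard gRPC status package and a hypothetical removeContainer stand-in:

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for the CRI RemoveContainer call; like the
// runtime in the records above, it reports the container as already gone.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

// cleanup treats NotFound as success: the container is absent either way,
// so the caller can stop retrying instead of surfacing an error.
func cleanup(id string) error {
	err := removeContainer(id)
	if status.Code(err) == codes.NotFound {
		fmt.Printf("container %s already removed, nothing to do\n", id)
		return nil
	}
	return err
}

func main() {
	_ = cleanup("8ab2c065ac27")
}
```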
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.161478 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.162541 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.171039 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz"] Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.299108 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.299537 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6mc8\" (UniqueName: \"kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.300180 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.402536 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6mc8\" (UniqueName: \"kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.402682 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.402752 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.404143 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume\") pod 
\"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.420968 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.424722 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6mc8\" (UniqueName: \"kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8\") pod \"collect-profiles-29414415-98jrz\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:00 crc kubenswrapper[4676]: I1204 16:15:00.492502 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:01 crc kubenswrapper[4676]: I1204 16:15:00.994252 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz"] Dec 04 16:15:01 crc kubenswrapper[4676]: I1204 16:15:01.973917 4676 generic.go:334] "Generic (PLEG): container finished" podID="b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" containerID="50f9b1e03d8d94f70b8d649008570ece80b7625e773edfd50995b3c35a19dd70" exitCode=0 Dec 04 16:15:01 crc kubenswrapper[4676]: I1204 16:15:01.974068 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" event={"ID":"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0","Type":"ContainerDied","Data":"50f9b1e03d8d94f70b8d649008570ece80b7625e773edfd50995b3c35a19dd70"} Dec 04 16:15:01 crc kubenswrapper[4676]: I1204 16:15:01.974397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" event={"ID":"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0","Type":"ContainerStarted","Data":"dcd6998e29683089a4fab9071c5446f33728ea50fc6986ee70eff279af6d6d28"} Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.356405 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.511519 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume\") pod \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.511606 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6mc8\" (UniqueName: \"kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8\") pod \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.511731 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume\") pod \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\" (UID: \"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0\") " Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.512067 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" (UID: "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.512554 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.525280 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8" (OuterVolumeSpecName: "kube-api-access-l6mc8") pod "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" (UID: "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0"). InnerVolumeSpecName "kube-api-access-l6mc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.528869 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" (UID: "b9026fa1-14f7-4dfe-90bd-c8fb160f18a0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.614149 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.614352 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6mc8\" (UniqueName: \"kubernetes.io/projected/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0-kube-api-access-l6mc8\") on node \"crc\" DevicePath \"\"" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.995309 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" event={"ID":"b9026fa1-14f7-4dfe-90bd-c8fb160f18a0","Type":"ContainerDied","Data":"dcd6998e29683089a4fab9071c5446f33728ea50fc6986ee70eff279af6d6d28"} Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.995641 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd6998e29683089a4fab9071c5446f33728ea50fc6986ee70eff279af6d6d28" Dec 04 16:15:03 crc kubenswrapper[4676]: I1204 16:15:03.995396 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz" Dec 04 16:15:04 crc kubenswrapper[4676]: I1204 16:15:04.431498 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"] Dec 04 16:15:04 crc kubenswrapper[4676]: I1204 16:15:04.441414 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414370-pfx7r"] Dec 04 16:15:05 crc kubenswrapper[4676]: I1204 16:15:05.401846 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b70fed03-9c92-403c-9f63-732c2aeb0fd6" path="/var/lib/kubelet/pods/b70fed03-9c92-403c-9f63-732c2aeb0fd6/volumes" Dec 04 16:15:46 crc kubenswrapper[4676]: I1204 16:15:46.027224 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:15:46 crc kubenswrapper[4676]: I1204 16:15:46.027854 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:16:02 crc kubenswrapper[4676]: I1204 16:16:02.726040 4676 scope.go:117] "RemoveContainer" containerID="8b22d30475a9fd360280e23a4d36e904846b1d07e6dd241e5c093771aef99b6d" Dec 04 16:16:16 crc kubenswrapper[4676]: I1204 16:16:16.027199 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:16:16 crc kubenswrapper[4676]: I1204 16:16:16.027738 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.027513 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.028118 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.028169 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.029084 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.029145 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42" gracePeriod=600 Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.169977 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42" exitCode=0 Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.170011 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42"} Dec 04 16:16:46 crc kubenswrapper[4676]: I1204 16:16:46.170096 4676 scope.go:117] "RemoveContainer" containerID="56ebbebe155d9fe45d7801a188e2ef52f4efdc44def04e05ffd4ab60632b58f5" Dec 04 16:16:47 crc kubenswrapper[4676]: I1204 16:16:47.182214 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3"} Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.626252 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:07 crc kubenswrapper[4676]: E1204 16:18:07.627339 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" containerName="collect-profiles" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.627364 4676 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" containerName="collect-profiles" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.627647 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" containerName="collect-profiles" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.629716 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.638557 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.718713 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.719329 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76g4\" (UniqueName: \"kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.719541 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.821937 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.822138 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76g4\" (UniqueName: \"kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.822301 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.822730 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.822729 4676 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.844497 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76g4\" (UniqueName: \"kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4\") pod \"certified-operators-x94nb\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:07 crc kubenswrapper[4676]: I1204 16:18:07.951615 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:08 crc kubenswrapper[4676]: I1204 16:18:08.496637 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:08 crc kubenswrapper[4676]: I1204 16:18:08.600188 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerStarted","Data":"43879f5fe65dec197d404518a180fee185f67b8a61b3be98a949f12eef0b3d53"} Dec 04 16:18:09 crc kubenswrapper[4676]: I1204 16:18:09.612358 4676 generic.go:334] "Generic (PLEG): container finished" podID="dd673451-4120-459c-ab5e-49532de8a6ce" containerID="b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2" exitCode=0 Dec 04 16:18:09 crc kubenswrapper[4676]: I1204 16:18:09.612447 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerDied","Data":"b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2"} Dec 04 16:18:10 crc kubenswrapper[4676]: I1204 16:18:10.625479 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerStarted","Data":"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32"} Dec 04 16:18:12 crc kubenswrapper[4676]: I1204 16:18:12.648824 4676 generic.go:334] "Generic (PLEG): container finished" podID="dd673451-4120-459c-ab5e-49532de8a6ce" containerID="aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32" exitCode=0 Dec 04 16:18:12 crc kubenswrapper[4676]: I1204 16:18:12.648898 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerDied","Data":"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32"} Dec 04 16:18:13 crc kubenswrapper[4676]: I1204 16:18:13.663507 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerStarted","Data":"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881"} Dec 04 16:18:13 crc kubenswrapper[4676]: I1204 16:18:13.691503 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x94nb" podStartSLOduration=3.225624282 podStartE2EDuration="6.691486918s" podCreationTimestamp="2025-12-04 16:18:07 +0000 UTC" 
firstStartedPulling="2025-12-04 16:18:09.614769282 +0000 UTC m=+3497.049439149" lastFinishedPulling="2025-12-04 16:18:13.080631928 +0000 UTC m=+3500.515301785" observedRunningTime="2025-12-04 16:18:13.684980843 +0000 UTC m=+3501.119650700" watchObservedRunningTime="2025-12-04 16:18:13.691486918 +0000 UTC m=+3501.126156775" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.209721 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.212595 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.478723 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.478841 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkjw6\" (UniqueName: \"kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.478882 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.496064 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.580654 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.580801 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkjw6\" (UniqueName: \"kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.580854 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.581249 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content\") pod 
\"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.581377 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.612313 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkjw6\" (UniqueName: \"kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6\") pod \"community-operators-4q9gk\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:16 crc kubenswrapper[4676]: I1204 16:18:16.812749 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.418751 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:17 crc kubenswrapper[4676]: W1204 16:18:17.424994 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfa0212a_3a2e_4af4_9841_01e05e39a0eb.slice/crio-e68268522939b1b9ace587f0874b8163688341b1b56a1116fee9caf7878cb845 WatchSource:0}: Error finding container e68268522939b1b9ace587f0874b8163688341b1b56a1116fee9caf7878cb845: Status 404 returned error can't find the container with id e68268522939b1b9ace587f0874b8163688341b1b56a1116fee9caf7878cb845 Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.708731 4676 generic.go:334] "Generic (PLEG): container finished" podID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerID="f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04" exitCode=0 Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.708777 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerDied","Data":"f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04"} Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.708805 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerStarted","Data":"e68268522939b1b9ace587f0874b8163688341b1b56a1116fee9caf7878cb845"} Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.952247 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:17 crc kubenswrapper[4676]: I1204 16:18:17.952576 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:18 crc kubenswrapper[4676]: I1204 16:18:18.002157 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:18 crc kubenswrapper[4676]: I1204 16:18:18.719074 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" 
event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerStarted","Data":"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553"} Dec 04 16:18:18 crc kubenswrapper[4676]: I1204 16:18:18.776219 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:19 crc kubenswrapper[4676]: I1204 16:18:19.732109 4676 generic.go:334] "Generic (PLEG): container finished" podID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerID="b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553" exitCode=0 Dec 04 16:18:19 crc kubenswrapper[4676]: I1204 16:18:19.732158 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerDied","Data":"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553"} Dec 04 16:18:20 crc kubenswrapper[4676]: I1204 16:18:20.004767 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:20 crc kubenswrapper[4676]: I1204 16:18:20.744888 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerStarted","Data":"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28"} Dec 04 16:18:20 crc kubenswrapper[4676]: I1204 16:18:20.771274 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4q9gk" podStartSLOduration=2.120167632 podStartE2EDuration="4.771251762s" podCreationTimestamp="2025-12-04 16:18:16 +0000 UTC" firstStartedPulling="2025-12-04 16:18:17.711058765 +0000 UTC m=+3505.145728622" lastFinishedPulling="2025-12-04 16:18:20.362142905 +0000 UTC m=+3507.796812752" observedRunningTime="2025-12-04 16:18:20.764196201 +0000 UTC m=+3508.198866108" watchObservedRunningTime="2025-12-04 16:18:20.771251762 +0000 UTC m=+3508.205921619" Dec 04 16:18:21 crc kubenswrapper[4676]: I1204 16:18:21.755723 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x94nb" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="registry-server" containerID="cri-o://e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881" gracePeriod=2 Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.277306 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.430169 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76g4\" (UniqueName: \"kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4\") pod \"dd673451-4120-459c-ab5e-49532de8a6ce\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.430546 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities\") pod \"dd673451-4120-459c-ab5e-49532de8a6ce\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.430586 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content\") pod \"dd673451-4120-459c-ab5e-49532de8a6ce\" (UID: \"dd673451-4120-459c-ab5e-49532de8a6ce\") " Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.431435 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities" (OuterVolumeSpecName: "utilities") pod "dd673451-4120-459c-ab5e-49532de8a6ce" (UID: "dd673451-4120-459c-ab5e-49532de8a6ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.437483 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4" (OuterVolumeSpecName: "kube-api-access-k76g4") pod "dd673451-4120-459c-ab5e-49532de8a6ce" (UID: "dd673451-4120-459c-ab5e-49532de8a6ce"). InnerVolumeSpecName "kube-api-access-k76g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.505781 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd673451-4120-459c-ab5e-49532de8a6ce" (UID: "dd673451-4120-459c-ab5e-49532de8a6ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.533411 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.533444 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd673451-4120-459c-ab5e-49532de8a6ce-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.533458 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76g4\" (UniqueName: \"kubernetes.io/projected/dd673451-4120-459c-ab5e-49532de8a6ce-kube-api-access-k76g4\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.766643 4676 generic.go:334] "Generic (PLEG): container finished" podID="dd673451-4120-459c-ab5e-49532de8a6ce" containerID="e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881" exitCode=0 Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.766695 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerDied","Data":"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881"} Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.766756 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x94nb" event={"ID":"dd673451-4120-459c-ab5e-49532de8a6ce","Type":"ContainerDied","Data":"43879f5fe65dec197d404518a180fee185f67b8a61b3be98a949f12eef0b3d53"} Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.766781 4676 scope.go:117] "RemoveContainer" containerID="e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.766939 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x94nb" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.801861 4676 scope.go:117] "RemoveContainer" containerID="aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.803464 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.811940 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x94nb"] Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.834374 4676 scope.go:117] "RemoveContainer" containerID="b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.883040 4676 scope.go:117] "RemoveContainer" containerID="e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881" Dec 04 16:18:22 crc kubenswrapper[4676]: E1204 16:18:22.883712 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881\": container with ID starting with e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881 not found: ID does not exist" containerID="e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.883761 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881"} err="failed to get container status \"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881\": rpc error: code = NotFound desc = could not find container \"e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881\": container with ID starting with e2030fbc1abcebb0a7bc65e6319288edb11661675f08f12b5a9f618327d46881 not found: ID does not exist" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.883792 4676 scope.go:117] "RemoveContainer" containerID="aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32" Dec 04 16:18:22 crc kubenswrapper[4676]: E1204 16:18:22.888445 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32\": container with ID starting with aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32 not found: ID does not exist" containerID="aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.888486 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32"} err="failed to get container status \"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32\": rpc error: code = NotFound desc = could not find container \"aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32\": container with ID starting with aee78e8e22a4db87a17f90ba0a28484a603d22e075db88e41aafd6f1c7ccac32 not found: ID does not exist" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.888513 4676 scope.go:117] "RemoveContainer" containerID="b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2" Dec 04 16:18:22 crc kubenswrapper[4676]: E1204 16:18:22.888833 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2\": container with ID starting with b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2 not found: ID does not exist" containerID="b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2" Dec 04 16:18:22 crc kubenswrapper[4676]: I1204 16:18:22.888892 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2"} err="failed to get container status \"b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2\": rpc error: code = NotFound desc = could not find container \"b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2\": container with ID starting with b634c17b5366eb28c5a50f1f434bb258371318d2bd7b58f93ad3ca7717c9b7c2 not found: ID does not exist" Dec 04 16:18:23 crc kubenswrapper[4676]: I1204 16:18:23.399368 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" path="/var/lib/kubelet/pods/dd673451-4120-459c-ab5e-49532de8a6ce/volumes" Dec 04 16:18:26 crc kubenswrapper[4676]: I1204 16:18:26.814016 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:26 crc kubenswrapper[4676]: I1204 16:18:26.814359 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:26 crc kubenswrapper[4676]: I1204 16:18:26.965164 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:27 crc kubenswrapper[4676]: I1204 16:18:27.896582 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:27 crc kubenswrapper[4676]: I1204 16:18:27.959303 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:29 crc kubenswrapper[4676]: I1204 16:18:29.862197 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4q9gk" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="registry-server" containerID="cri-o://75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28" gracePeriod=2 Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.362625 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.519175 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content\") pod \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.519330 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkjw6\" (UniqueName: \"kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6\") pod \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.519515 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities\") pod \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\" (UID: \"dfa0212a-3a2e-4af4-9841-01e05e39a0eb\") " Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.520361 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities" (OuterVolumeSpecName: "utilities") pod "dfa0212a-3a2e-4af4-9841-01e05e39a0eb" (UID: "dfa0212a-3a2e-4af4-9841-01e05e39a0eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.525150 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6" (OuterVolumeSpecName: "kube-api-access-nkjw6") pod "dfa0212a-3a2e-4af4-9841-01e05e39a0eb" (UID: "dfa0212a-3a2e-4af4-9841-01e05e39a0eb"). InnerVolumeSpecName "kube-api-access-nkjw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.578388 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dfa0212a-3a2e-4af4-9841-01e05e39a0eb" (UID: "dfa0212a-3a2e-4af4-9841-01e05e39a0eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.622619 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.622659 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.622673 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkjw6\" (UniqueName: \"kubernetes.io/projected/dfa0212a-3a2e-4af4-9841-01e05e39a0eb-kube-api-access-nkjw6\") on node \"crc\" DevicePath \"\"" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.873419 4676 generic.go:334] "Generic (PLEG): container finished" podID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerID="75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28" exitCode=0 Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.873461 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerDied","Data":"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28"} Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.873486 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4q9gk" event={"ID":"dfa0212a-3a2e-4af4-9841-01e05e39a0eb","Type":"ContainerDied","Data":"e68268522939b1b9ace587f0874b8163688341b1b56a1116fee9caf7878cb845"} Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.873493 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4q9gk" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.873502 4676 scope.go:117] "RemoveContainer" containerID="75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.905557 4676 scope.go:117] "RemoveContainer" containerID="b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.921776 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.929558 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4q9gk"] Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.934663 4676 scope.go:117] "RemoveContainer" containerID="f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.988619 4676 scope.go:117] "RemoveContainer" containerID="75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28" Dec 04 16:18:30 crc kubenswrapper[4676]: E1204 16:18:30.989135 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28\": container with ID starting with 75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28 not found: ID does not exist" containerID="75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.989187 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28"} err="failed to get container status \"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28\": rpc error: code = NotFound desc = could not find container \"75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28\": container with ID starting with 75bd884f6d536a5e3c10b867092b86a2bc023943ccd809e2f2cdbeb75edc3f28 not found: ID does not exist" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.989208 4676 scope.go:117] "RemoveContainer" containerID="b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553" Dec 04 16:18:30 crc kubenswrapper[4676]: E1204 16:18:30.989560 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553\": container with ID starting with b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553 not found: ID does not exist" containerID="b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.989641 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553"} err="failed to get container status \"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553\": rpc error: code = NotFound desc = could not find container \"b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553\": container with ID starting with b98ce8e1c55418cd4596d350829bbfa5dad622c11d748349818259e20f254553 not found: ID does not exist" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.989706 4676 scope.go:117] "RemoveContainer" 
containerID="f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04" Dec 04 16:18:30 crc kubenswrapper[4676]: E1204 16:18:30.991145 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04\": container with ID starting with f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04 not found: ID does not exist" containerID="f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04" Dec 04 16:18:30 crc kubenswrapper[4676]: I1204 16:18:30.991225 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04"} err="failed to get container status \"f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04\": rpc error: code = NotFound desc = could not find container \"f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04\": container with ID starting with f11ef42a947aa335cb02b23f4725f12c3183d161865591db5dbd37413b150d04 not found: ID does not exist" Dec 04 16:18:31 crc kubenswrapper[4676]: I1204 16:18:31.397207 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" path="/var/lib/kubelet/pods/dfa0212a-3a2e-4af4-9841-01e05e39a0eb/volumes" Dec 04 16:18:46 crc kubenswrapper[4676]: I1204 16:18:46.026579 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:18:46 crc kubenswrapper[4676]: I1204 16:18:46.027205 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:19:16 crc kubenswrapper[4676]: I1204 16:19:16.027208 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:19:16 crc kubenswrapper[4676]: I1204 16:19:16.027885 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.027240 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.028190 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.028278 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.029296 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.029391 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" gracePeriod=600 Dec 04 16:19:46 crc kubenswrapper[4676]: E1204 16:19:46.154090 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.756315 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" exitCode=0 Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.756435 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3"} Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.756678 4676 scope.go:117] "RemoveContainer" containerID="324b4d71b7a5c8456a57733048183b3190856bfe21dd034d34acdf0a96c9ae42" Dec 04 16:19:46 crc kubenswrapper[4676]: I1204 16:19:46.757401 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:19:46 crc kubenswrapper[4676]: E1204 16:19:46.757729 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:20:00 crc kubenswrapper[4676]: I1204 16:20:00.385284 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:20:00 crc kubenswrapper[4676]: E1204 16:20:00.386391 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:20:11 crc kubenswrapper[4676]: I1204 16:20:11.384434 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:20:11 crc kubenswrapper[4676]: E1204 16:20:11.385534 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:20:25 crc kubenswrapper[4676]: I1204 16:20:25.385292 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:20:25 crc kubenswrapper[4676]: E1204 16:20:25.386207 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:20:25 crc kubenswrapper[4676]: E1204 16:20:25.913951 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 04 16:20:38 crc kubenswrapper[4676]: I1204 16:20:38.385041 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:20:38 crc kubenswrapper[4676]: E1204 16:20:38.385871 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:20:51 crc kubenswrapper[4676]: I1204 16:20:51.385099 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:20:51 crc kubenswrapper[4676]: E1204 16:20:51.385994 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:21:03 crc kubenswrapper[4676]: I1204 16:21:03.393165 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:21:03 crc kubenswrapper[4676]: E1204 16:21:03.394226 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:21:15 crc kubenswrapper[4676]: I1204 16:21:15.385335 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:21:15 crc kubenswrapper[4676]: E1204 16:21:15.387204 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.074757 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.075942 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.075971 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.075994 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="extract-content" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076003 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="extract-content" Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.076028 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076036 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.076051 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="extract-utilities" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076059 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="extract-utilities" Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.076076 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="extract-content" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076083 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="extract-content" Dec 04 16:21:23 crc kubenswrapper[4676]: E1204 16:21:23.076113 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="extract-utilities" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076121 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="extract-utilities" Dec 04 16:21:23 
crc kubenswrapper[4676]: I1204 16:21:23.076379 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfa0212a-3a2e-4af4-9841-01e05e39a0eb" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.076397 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd673451-4120-459c-ab5e-49532de8a6ce" containerName="registry-server" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.078089 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.090353 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.182517 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.182586 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.182772 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmkm\" (UniqueName: \"kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.285321 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmkm\" (UniqueName: \"kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.285453 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.285495 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.286000 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " 
pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.286149 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.307838 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmkm\" (UniqueName: \"kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm\") pod \"redhat-marketplace-rn8wh\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.397073 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:23 crc kubenswrapper[4676]: I1204 16:21:23.920037 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:24 crc kubenswrapper[4676]: I1204 16:21:24.051606 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerStarted","Data":"8310ec90011e3ce82afd0e160fec7404671c0e4fac713dfa08a0b810e5c4d410"} Dec 04 16:21:25 crc kubenswrapper[4676]: I1204 16:21:25.063203 4676 generic.go:334] "Generic (PLEG): container finished" podID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerID="fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e" exitCode=0 Dec 04 16:21:25 crc kubenswrapper[4676]: I1204 16:21:25.063244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerDied","Data":"fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e"} Dec 04 16:21:25 crc kubenswrapper[4676]: I1204 16:21:25.065583 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 16:21:26 crc kubenswrapper[4676]: I1204 16:21:26.076276 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerStarted","Data":"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597"} Dec 04 16:21:26 crc kubenswrapper[4676]: I1204 16:21:26.385078 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:21:26 crc kubenswrapper[4676]: E1204 16:21:26.385373 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:21:27 crc kubenswrapper[4676]: I1204 16:21:27.089782 4676 generic.go:334] "Generic (PLEG): container finished" podID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerID="49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597" exitCode=0 Dec 04 16:21:27 crc 
kubenswrapper[4676]: I1204 16:21:27.090244 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerDied","Data":"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597"} Dec 04 16:21:28 crc kubenswrapper[4676]: I1204 16:21:28.102260 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerStarted","Data":"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be"} Dec 04 16:21:28 crc kubenswrapper[4676]: I1204 16:21:28.128594 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rn8wh" podStartSLOduration=2.708550802 podStartE2EDuration="5.128578025s" podCreationTimestamp="2025-12-04 16:21:23 +0000 UTC" firstStartedPulling="2025-12-04 16:21:25.065324859 +0000 UTC m=+3692.499994716" lastFinishedPulling="2025-12-04 16:21:27.485352082 +0000 UTC m=+3694.920021939" observedRunningTime="2025-12-04 16:21:28.123568102 +0000 UTC m=+3695.558237959" watchObservedRunningTime="2025-12-04 16:21:28.128578025 +0000 UTC m=+3695.563247882" Dec 04 16:21:33 crc kubenswrapper[4676]: I1204 16:21:33.398148 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:33 crc kubenswrapper[4676]: I1204 16:21:33.398790 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:33 crc kubenswrapper[4676]: I1204 16:21:33.443857 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:34 crc kubenswrapper[4676]: I1204 16:21:34.230677 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:36 crc kubenswrapper[4676]: I1204 16:21:36.410022 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:36 crc kubenswrapper[4676]: I1204 16:21:36.410522 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rn8wh" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="registry-server" containerID="cri-o://6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be" gracePeriod=2 Dec 04 16:21:36 crc kubenswrapper[4676]: I1204 16:21:36.929356 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.014401 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities\") pod \"fe37d77f-bc68-481b-83cc-c8558e4e8367\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.014571 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content\") pod \"fe37d77f-bc68-481b-83cc-c8558e4e8367\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.014680 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgmkm\" (UniqueName: \"kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm\") pod \"fe37d77f-bc68-481b-83cc-c8558e4e8367\" (UID: \"fe37d77f-bc68-481b-83cc-c8558e4e8367\") " Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.015506 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities" (OuterVolumeSpecName: "utilities") pod "fe37d77f-bc68-481b-83cc-c8558e4e8367" (UID: "fe37d77f-bc68-481b-83cc-c8558e4e8367"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.016077 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.025667 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm" (OuterVolumeSpecName: "kube-api-access-hgmkm") pod "fe37d77f-bc68-481b-83cc-c8558e4e8367" (UID: "fe37d77f-bc68-481b-83cc-c8558e4e8367"). InnerVolumeSpecName "kube-api-access-hgmkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.043353 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe37d77f-bc68-481b-83cc-c8558e4e8367" (UID: "fe37d77f-bc68-481b-83cc-c8558e4e8367"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.118084 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe37d77f-bc68-481b-83cc-c8558e4e8367-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.118131 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgmkm\" (UniqueName: \"kubernetes.io/projected/fe37d77f-bc68-481b-83cc-c8558e4e8367-kube-api-access-hgmkm\") on node \"crc\" DevicePath \"\"" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.420372 4676 generic.go:334] "Generic (PLEG): container finished" podID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerID="6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be" exitCode=0 Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.420432 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerDied","Data":"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be"} Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.420462 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rn8wh" event={"ID":"fe37d77f-bc68-481b-83cc-c8558e4e8367","Type":"ContainerDied","Data":"8310ec90011e3ce82afd0e160fec7404671c0e4fac713dfa08a0b810e5c4d410"} Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.420481 4676 scope.go:117] "RemoveContainer" containerID="6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.420607 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rn8wh" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.452346 4676 scope.go:117] "RemoveContainer" containerID="49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.459043 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.469449 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rn8wh"] Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.483706 4676 scope.go:117] "RemoveContainer" containerID="fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.535710 4676 scope.go:117] "RemoveContainer" containerID="6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be" Dec 04 16:21:37 crc kubenswrapper[4676]: E1204 16:21:37.536288 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be\": container with ID starting with 6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be not found: ID does not exist" containerID="6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.536390 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be"} err="failed to get container status \"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be\": rpc error: code = NotFound desc = could not find container \"6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be\": container with ID starting with 6135d1dee67674303704fc3853cf27ae3036361a605f1a695e4b2c7f92b1b6be not found: ID does not exist" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.536475 4676 scope.go:117] "RemoveContainer" containerID="49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597" Dec 04 16:21:37 crc kubenswrapper[4676]: E1204 16:21:37.536997 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597\": container with ID starting with 49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597 not found: ID does not exist" containerID="49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.537043 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597"} err="failed to get container status \"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597\": rpc error: code = NotFound desc = could not find container \"49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597\": container with ID starting with 49a4478dbd6a555de55d36732a2243ec7bb6c1c3d5eae232f4370fb17b38d597 not found: ID does not exist" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.537072 4676 scope.go:117] "RemoveContainer" containerID="fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e" Dec 04 16:21:37 crc kubenswrapper[4676]: E1204 16:21:37.537424 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e\": container with ID starting with fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e not found: ID does not exist" containerID="fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e" Dec 04 16:21:37 crc kubenswrapper[4676]: I1204 16:21:37.537501 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e"} err="failed to get container status \"fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e\": rpc error: code = NotFound desc = could not find container \"fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e\": container with ID starting with fb1ea019a687fcd4476c262eec5a327931124adce1d7b27343f796aa68dfc58e not found: ID does not exist" Dec 04 16:21:39 crc kubenswrapper[4676]: I1204 16:21:39.399145 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" path="/var/lib/kubelet/pods/fe37d77f-bc68-481b-83cc-c8558e4e8367/volumes" Dec 04 16:21:41 crc kubenswrapper[4676]: I1204 16:21:41.385405 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:21:41 crc kubenswrapper[4676]: E1204 16:21:41.385978 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:21:55 crc kubenswrapper[4676]: I1204 16:21:55.385101 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:21:55 crc kubenswrapper[4676]: E1204 16:21:55.385942 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:22:09 crc kubenswrapper[4676]: I1204 16:22:09.385458 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:22:09 crc kubenswrapper[4676]: E1204 16:22:09.386320 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:22:21 crc kubenswrapper[4676]: I1204 16:22:21.384315 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:22:21 crc kubenswrapper[4676]: E1204 16:22:21.385170 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:22:34 crc kubenswrapper[4676]: I1204 16:22:34.385269 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:22:34 crc kubenswrapper[4676]: E1204 16:22:34.386318 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:22:48 crc kubenswrapper[4676]: I1204 16:22:48.385934 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:22:48 crc kubenswrapper[4676]: E1204 16:22:48.386637 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:23:03 crc kubenswrapper[4676]: I1204 16:23:03.392880 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:23:03 crc kubenswrapper[4676]: E1204 16:23:03.393716 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:23:15 crc kubenswrapper[4676]: I1204 16:23:15.384745 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:23:15 crc kubenswrapper[4676]: E1204 16:23:15.385574 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:23:28 crc kubenswrapper[4676]: I1204 16:23:28.384650 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:23:28 crc kubenswrapper[4676]: E1204 16:23:28.385580 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:23:40 crc kubenswrapper[4676]: I1204 16:23:40.547327 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:23:40 crc kubenswrapper[4676]: E1204 16:23:40.548421 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:23:55 crc kubenswrapper[4676]: I1204 16:23:55.384353 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:23:55 crc kubenswrapper[4676]: E1204 16:23:55.385038 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:24:10 crc kubenswrapper[4676]: I1204 16:24:10.384653 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:24:10 crc kubenswrapper[4676]: E1204 16:24:10.385410 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:24:23 crc kubenswrapper[4676]: I1204 16:24:23.392205 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:24:23 crc kubenswrapper[4676]: E1204 16:24:23.393176 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:24:38 crc kubenswrapper[4676]: I1204 16:24:38.384695 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:24:38 crc kubenswrapper[4676]: E1204 16:24:38.385503 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" 
podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:24:50 crc kubenswrapper[4676]: I1204 16:24:50.396237 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3" Dec 04 16:24:51 crc kubenswrapper[4676]: I1204 16:24:51.599098 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8"} Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.815529 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"] Dec 04 16:25:45 crc kubenswrapper[4676]: E1204 16:25:45.816665 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="extract-utilities" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.816696 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="extract-utilities" Dec 04 16:25:45 crc kubenswrapper[4676]: E1204 16:25:45.816716 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="extract-content" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.816725 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="extract-content" Dec 04 16:25:45 crc kubenswrapper[4676]: E1204 16:25:45.816762 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="registry-server" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.816771 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="registry-server" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.817061 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe37d77f-bc68-481b-83cc-c8558e4e8367" containerName="registry-server" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.819510 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.836795 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"] Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.957311 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4hqz\" (UniqueName: \"kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.957446 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:45 crc kubenswrapper[4676]: I1204 16:25:45.957738 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.059431 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4hqz\" (UniqueName: \"kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.059585 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.059726 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.060359 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.060358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.105189 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d4hqz\" (UniqueName: \"kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz\") pod \"redhat-operators-gqgcz\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.144125 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:46 crc kubenswrapper[4676]: I1204 16:25:46.816636 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"] Dec 04 16:25:47 crc kubenswrapper[4676]: I1204 16:25:47.306915 4676 generic.go:334] "Generic (PLEG): container finished" podID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerID="0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4" exitCode=0 Dec 04 16:25:47 crc kubenswrapper[4676]: I1204 16:25:47.306969 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerDied","Data":"0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4"} Dec 04 16:25:47 crc kubenswrapper[4676]: I1204 16:25:47.307001 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerStarted","Data":"2102df82e80af906bc1371e6de66edcd5030f8f32c557eea21513c00425ac590"} Dec 04 16:25:48 crc kubenswrapper[4676]: I1204 16:25:48.320240 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerStarted","Data":"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"} Dec 04 16:25:51 crc kubenswrapper[4676]: I1204 16:25:51.357194 4676 generic.go:334] "Generic (PLEG): container finished" podID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerID="9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108" exitCode=0 Dec 04 16:25:51 crc kubenswrapper[4676]: I1204 16:25:51.357272 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerDied","Data":"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"} Dec 04 16:25:52 crc kubenswrapper[4676]: I1204 16:25:52.371352 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerStarted","Data":"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"} Dec 04 16:25:52 crc kubenswrapper[4676]: I1204 16:25:52.403946 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gqgcz" podStartSLOduration=2.882270268 podStartE2EDuration="7.403910122s" podCreationTimestamp="2025-12-04 16:25:45 +0000 UTC" firstStartedPulling="2025-12-04 16:25:47.309432962 +0000 UTC m=+3954.744102809" lastFinishedPulling="2025-12-04 16:25:51.831072806 +0000 UTC m=+3959.265742663" observedRunningTime="2025-12-04 16:25:52.394669068 +0000 UTC m=+3959.829338925" watchObservedRunningTime="2025-12-04 16:25:52.403910122 +0000 UTC m=+3959.838579979" Dec 04 16:25:56 crc kubenswrapper[4676]: I1204 16:25:56.144237 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 
16:25:56 crc kubenswrapper[4676]: I1204 16:25:56.145251 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:25:57 crc kubenswrapper[4676]: I1204 16:25:57.198695 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gqgcz" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="registry-server" probeResult="failure" output=< Dec 04 16:25:57 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 16:25:57 crc kubenswrapper[4676]: > Dec 04 16:26:06 crc kubenswrapper[4676]: I1204 16:26:06.213680 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:26:06 crc kubenswrapper[4676]: I1204 16:26:06.270241 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:26:06 crc kubenswrapper[4676]: I1204 16:26:06.458995 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"] Dec 04 16:26:07 crc kubenswrapper[4676]: I1204 16:26:07.526213 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gqgcz" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="registry-server" containerID="cri-o://ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387" gracePeriod=2 Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.185704 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gqgcz" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.276548 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4hqz\" (UniqueName: \"kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz\") pod \"37c7931a-8949-40c7-8f67-c357a097ed3a\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.276763 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content\") pod \"37c7931a-8949-40c7-8f67-c357a097ed3a\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.276843 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities\") pod \"37c7931a-8949-40c7-8f67-c357a097ed3a\" (UID: \"37c7931a-8949-40c7-8f67-c357a097ed3a\") " Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.277874 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities" (OuterVolumeSpecName: "utilities") pod "37c7931a-8949-40c7-8f67-c357a097ed3a" (UID: "37c7931a-8949-40c7-8f67-c357a097ed3a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.283182 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz" (OuterVolumeSpecName: "kube-api-access-d4hqz") pod "37c7931a-8949-40c7-8f67-c357a097ed3a" (UID: "37c7931a-8949-40c7-8f67-c357a097ed3a"). InnerVolumeSpecName "kube-api-access-d4hqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.380793 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.380827 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4hqz\" (UniqueName: \"kubernetes.io/projected/37c7931a-8949-40c7-8f67-c357a097ed3a-kube-api-access-d4hqz\") on node \"crc\" DevicePath \"\"" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.413232 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37c7931a-8949-40c7-8f67-c357a097ed3a" (UID: "37c7931a-8949-40c7-8f67-c357a097ed3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.482692 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37c7931a-8949-40c7-8f67-c357a097ed3a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.540739 4676 generic.go:334] "Generic (PLEG): container finished" podID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerID="ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387" exitCode=0 Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.540794 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerDied","Data":"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"} Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.540835 4676 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.540871 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gqgcz" event={"ID":"37c7931a-8949-40c7-8f67-c357a097ed3a","Type":"ContainerDied","Data":"2102df82e80af906bc1371e6de66edcd5030f8f32c557eea21513c00425ac590"}
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.540891 4676 scope.go:117] "RemoveContainer" containerID="ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.569975 4676 scope.go:117] "RemoveContainer" containerID="9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.587981 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"]
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.599884 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gqgcz"]
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.608060 4676 scope.go:117] "RemoveContainer" containerID="0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.646343 4676 scope.go:117] "RemoveContainer" containerID="ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"
Dec 04 16:26:08 crc kubenswrapper[4676]: E1204 16:26:08.646820 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387\": container with ID starting with ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387 not found: ID does not exist" containerID="ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.646870 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387"} err="failed to get container status \"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387\": rpc error: code = NotFound desc = could not find container \"ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387\": container with ID starting with ef8665a885353d37ac864bf509cf63e7292dd96f1eebdf980087d16118484387 not found: ID does not exist"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.646913 4676 scope.go:117] "RemoveContainer" containerID="9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"
Dec 04 16:26:08 crc kubenswrapper[4676]: E1204 16:26:08.647222 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108\": container with ID starting with 9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108 not found: ID does not exist" containerID="9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.647244 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108"} err="failed to get container status \"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108\": rpc error: code = NotFound desc = could not find container \"9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108\": container with ID starting with 9f3170b8d8b61e189b64aa065a3490481db0ed8ce771ebe85f86e6b4c8ea9108 not found: ID does not exist"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.647258 4676 scope.go:117] "RemoveContainer" containerID="0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4"
Dec 04 16:26:08 crc kubenswrapper[4676]: E1204 16:26:08.647472 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4\": container with ID starting with 0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4 not found: ID does not exist" containerID="0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4"
Dec 04 16:26:08 crc kubenswrapper[4676]: I1204 16:26:08.647498 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4"} err="failed to get container status \"0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4\": rpc error: code = NotFound desc = could not find container \"0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4\": container with ID starting with 0ce98a9b771086a894c793ee04ad25a92fed0c91002a915b8ea50469d77cd0e4 not found: ID does not exist"
Dec 04 16:26:09 crc kubenswrapper[4676]: I1204 16:26:09.402079 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" path="/var/lib/kubelet/pods/37c7931a-8949-40c7-8f67-c357a097ed3a/volumes"
Dec 04 16:27:16 crc kubenswrapper[4676]: I1204 16:27:16.026740 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:27:16 crc kubenswrapper[4676]: I1204 16:27:16.027508 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:27:46 crc kubenswrapper[4676]: I1204 16:27:46.027206 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:27:46 crc kubenswrapper[4676]: I1204 16:27:46.027920 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:28:16 crc kubenswrapper[4676]: I1204 16:28:16.026265 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:28:16 crc kubenswrapper[4676]: I1204 16:28:16.026974 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:28:16 crc kubenswrapper[4676]: I1204 16:28:16.027055 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 16:28:16 crc kubenswrapper[4676]: I1204 16:28:16.028082 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 16:28:16 crc kubenswrapper[4676]: I1204 16:28:16.028165 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8" gracePeriod=600
Dec 04 16:28:17 crc kubenswrapper[4676]: I1204 16:28:17.312143 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8" exitCode=0
Dec 04 16:28:17 crc kubenswrapper[4676]: I1204 16:28:17.312227 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8"}
Dec 04 16:28:17 crc kubenswrapper[4676]: I1204 16:28:17.312571 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8"}
Dec 04 16:28:17 crc kubenswrapper[4676]: I1204 16:28:17.312600 4676 scope.go:117] "RemoveContainer" containerID="349195c1bf304b096cddf1556bbb2a5ff97b24b1c635c170b45ce44b32d6d1f3"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.526407 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:26 crc kubenswrapper[4676]: E1204 16:28:26.527605 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="registry-server"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.527626 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="registry-server"
Dec 04 16:28:26 crc kubenswrapper[4676]: E1204 16:28:26.527664 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="extract-content"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.527676 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="extract-content"
Dec 04 16:28:26 crc kubenswrapper[4676]: E1204 16:28:26.527717 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="extract-utilities"
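The machine-config-daemon records above show the full liveness cycle: repeated "connection refused" failures against http://127.0.0.1:8798/health, then "failed liveness probe, will be restarted", a kill with gracePeriod=600, and a fresh container start at 16:28:17. The probe itself is a plain HTTP GET, so it can be reproduced by hand on the node. A sketch of the equivalent manual check (URL copied verbatim from the probe output; run it on the node itself):

#!/usr/bin/env python3
# Manually perform the same check the kubelet liveness probe runs above:
# an HTTP GET against the machine-config-daemon health endpoint.
import urllib.error
import urllib.request

URL = "http://127.0.0.1:8798/health"  # copied from the probe output above

try:
    with urllib.request.urlopen(URL, timeout=1) as resp:
        print(f"healthy: HTTP {resp.status}")
except (urllib.error.URLError, OSError) as exc:
    # A refused connection here corresponds to the probeResult="failure"
    # records in the log.
    print(f"probe would fail: {exc}")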
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.527727 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="extract-utilities"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.528035 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c7931a-8949-40c7-8f67-c357a097ed3a" containerName="registry-server"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.530124 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.552306 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.667483 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.667888 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8h6b\" (UniqueName: \"kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.668551 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.771752 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8h6b\" (UniqueName: \"kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.771858 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.771952 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.772448 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.772476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.796476 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8h6b\" (UniqueName: \"kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b\") pod \"community-operators-vvcwq\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") " pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:26 crc kubenswrapper[4676]: I1204 16:28:26.865261 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:27 crc kubenswrapper[4676]: I1204 16:28:27.408297 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:27 crc kubenswrapper[4676]: I1204 16:28:27.474944 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerStarted","Data":"009aca232dfda786b408e9f51282fa160b9da5332af4abc7c920ad2a6b915054"}
Dec 04 16:28:28 crc kubenswrapper[4676]: I1204 16:28:28.491865 4676 generic.go:334] "Generic (PLEG): container finished" podID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerID="cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555" exitCode=0
Dec 04 16:28:28 crc kubenswrapper[4676]: I1204 16:28:28.491964 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerDied","Data":"cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555"}
Dec 04 16:28:28 crc kubenswrapper[4676]: I1204 16:28:28.496510 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 16:28:30 crc kubenswrapper[4676]: I1204 16:28:30.517322 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerStarted","Data":"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"}
Dec 04 16:28:31 crc kubenswrapper[4676]: I1204 16:28:31.535469 4676 generic.go:334] "Generic (PLEG): container finished" podID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerID="15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c" exitCode=0
Dec 04 16:28:31 crc kubenswrapper[4676]: I1204 16:28:31.539953 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerDied","Data":"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"}
Dec 04 16:28:32 crc kubenswrapper[4676]: I1204 16:28:32.547662 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerStarted","Data":"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"}
Dec 04 16:28:32 crc kubenswrapper[4676]: I1204 16:28:32.583123 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vvcwq" podStartSLOduration=3.067764292 podStartE2EDuration="6.583103343s" podCreationTimestamp="2025-12-04 16:28:26 +0000 UTC" firstStartedPulling="2025-12-04 16:28:28.49577438 +0000 UTC m=+4115.930444277" lastFinishedPulling="2025-12-04 16:28:32.011113471 +0000 UTC m=+4119.445783328" observedRunningTime="2025-12-04 16:28:32.569764882 +0000 UTC m=+4120.004434779" watchObservedRunningTime="2025-12-04 16:28:32.583103343 +0000 UTC m=+4120.017773210"
Dec 04 16:28:36 crc kubenswrapper[4676]: I1204 16:28:36.865746 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:36 crc kubenswrapper[4676]: I1204 16:28:36.866158 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:36 crc kubenswrapper[4676]: I1204 16:28:36.935983 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:37 crc kubenswrapper[4676]: I1204 16:28:37.664434 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:37 crc kubenswrapper[4676]: I1204 16:28:37.742233 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:39 crc kubenswrapper[4676]: I1204 16:28:39.622158 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vvcwq" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="registry-server" containerID="cri-o://e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee" gracePeriod=2
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.127500 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvcwq"
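The pod_startup_latency_tracker record above carries enough data to recompute its own durations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). A quick cross-check using only numbers copied from that record:

#!/usr/bin/env python3
# Recompute the durations in the pod_startup_latency_tracker record above.
# All values are copied verbatim from that record: wall-clock seconds within
# 16:28 for creation/observation, monotonic m=+ offsets for the pull window.

created        = 26.0           # podCreationTimestamp 16:28:26
watch_observed = 32.583103343   # watchObservedRunningTime 16:28:32.583103343
pull_started   = 4115.930444277 # firstStartedPulling, m=+4115.930444277
pull_finished  = 4119.445783328 # lastFinishedPulling,  m=+4119.445783328

e2e  = watch_observed - created
pull = pull_finished - pull_started
slo  = e2e - pull

print(f"podStartE2EDuration = {e2e:.9f}s")   # 6.583103343s, as logged
print(f"image pull window   = {pull:.9f}s")  # 3.515339051s
print(f"podStartSLOduration = {slo:.9f}s")   # 3.067764292s, as logged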
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.302801 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities\") pod \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") "
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.303002 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content\") pod \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") "
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.303131 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8h6b\" (UniqueName: \"kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b\") pod \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\" (UID: \"b861969b-fba0-463e-b2cb-8a5357d2b9a3\") "
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.303842 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities" (OuterVolumeSpecName: "utilities") pod "b861969b-fba0-463e-b2cb-8a5357d2b9a3" (UID: "b861969b-fba0-463e-b2cb-8a5357d2b9a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.309459 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b" (OuterVolumeSpecName: "kube-api-access-l8h6b") pod "b861969b-fba0-463e-b2cb-8a5357d2b9a3" (UID: "b861969b-fba0-463e-b2cb-8a5357d2b9a3"). InnerVolumeSpecName "kube-api-access-l8h6b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.359889 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b861969b-fba0-463e-b2cb-8a5357d2b9a3" (UID: "b861969b-fba0-463e-b2cb-8a5357d2b9a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.405077 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.405305 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8h6b\" (UniqueName: \"kubernetes.io/projected/b861969b-fba0-463e-b2cb-8a5357d2b9a3-kube-api-access-l8h6b\") on node \"crc\" DevicePath \"\""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.405374 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b861969b-fba0-463e-b2cb-8a5357d2b9a3-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.636430 4676 generic.go:334] "Generic (PLEG): container finished" podID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerID="e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee" exitCode=0
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.636487 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerDied","Data":"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"}
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.636543 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vvcwq" event={"ID":"b861969b-fba0-463e-b2cb-8a5357d2b9a3","Type":"ContainerDied","Data":"009aca232dfda786b408e9f51282fa160b9da5332af4abc7c920ad2a6b915054"}
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.636562 4676 scope.go:117] "RemoveContainer" containerID="e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.636753 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vvcwq"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.679291 4676 scope.go:117] "RemoveContainer" containerID="15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.694847 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.707431 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vvcwq"]
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.721982 4676 scope.go:117] "RemoveContainer" containerID="cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.762265 4676 scope.go:117] "RemoveContainer" containerID="e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"
Dec 04 16:28:40 crc kubenswrapper[4676]: E1204 16:28:40.762979 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee\": container with ID starting with e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee not found: ID does not exist" containerID="e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.763023 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee"} err="failed to get container status \"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee\": rpc error: code = NotFound desc = could not find container \"e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee\": container with ID starting with e6c099dd88c2524bfeb817e86069239ae6564610af23d8dca836c2360bc809ee not found: ID does not exist"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.763056 4676 scope.go:117] "RemoveContainer" containerID="15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"
Dec 04 16:28:40 crc kubenswrapper[4676]: E1204 16:28:40.763334 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c\": container with ID starting with 15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c not found: ID does not exist" containerID="15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.763373 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c"} err="failed to get container status \"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c\": rpc error: code = NotFound desc = could not find container \"15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c\": container with ID starting with 15d6bde94d22a0f0bdd783b274b013d76cdc6f4e1215ef02a4a0c6962a72447c not found: ID does not exist"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.763398 4676 scope.go:117] "RemoveContainer" containerID="cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555"
Dec 04 16:28:40 crc kubenswrapper[4676]: E1204 16:28:40.765055 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555\": container with ID starting with cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555 not found: ID does not exist" containerID="cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555"
Dec 04 16:28:40 crc kubenswrapper[4676]: I1204 16:28:40.765095 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555"} err="failed to get container status \"cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555\": rpc error: code = NotFound desc = could not find container \"cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555\": container with ID starting with cfbb16c253bc2e3bb0f69718b9caaad89572b685a52c15677401bd2a99214555 not found: ID does not exist"
Dec 04 16:28:41 crc kubenswrapper[4676]: I1204 16:28:41.405263 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" path="/var/lib/kubelet/pods/b861969b-fba0-463e-b2cb-8a5357d2b9a3/volumes"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.905415 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:00 crc kubenswrapper[4676]: E1204 16:29:00.907805 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="registry-server"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.907954 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="registry-server"
Dec 04 16:29:00 crc kubenswrapper[4676]: E1204 16:29:00.908104 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="extract-content"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.908184 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="extract-content"
Dec 04 16:29:00 crc kubenswrapper[4676]: E1204 16:29:00.908256 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="extract-utilities"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.908334 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="extract-utilities"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.908659 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b861969b-fba0-463e-b2cb-8a5357d2b9a3" containerName="registry-server"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.910832 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q75k6"
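The E/I pairs above ("ContainerStatus from runtime service failed ... NotFound" followed by "DeleteContainer returned error") look alarming but describe a benign race: by the time the kubelet re-queries a container it has just deleted, CRI-O answers NotFound, which means the removal already succeeded. A sketch of the general idempotent-delete pattern at play (the names are hypothetical illustrations, not kubelet code):

# Idempotent-delete pattern behind the NotFound churn above: deleting
# something that is already gone is treated as success, not failure.
# NotFoundError and runtime.remove are hypothetical stand-ins.

class NotFoundError(Exception):
    """Stand-in for the gRPC NotFound status CRI-O returns above."""

def remove_container(runtime, container_id: str) -> None:
    try:
        runtime.remove(container_id)
    except NotFoundError:
        pass  # already removed: the desired state is reached either way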
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.931199 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.994606 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7rl6\" (UniqueName: \"kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.995306 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:00 crc kubenswrapper[4676]: I1204 16:29:00.995453 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.097609 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7rl6\" (UniqueName: \"kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.097963 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.098035 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.098688 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.098820 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.124942 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7rl6\" (UniqueName: \"kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6\") pod \"certified-operators-q75k6\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") " pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.254842 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:01 crc kubenswrapper[4676]: I1204 16:29:01.798751 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:02 crc kubenswrapper[4676]: I1204 16:29:02.889480 4676 generic.go:334] "Generic (PLEG): container finished" podID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerID="f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892" exitCode=0
Dec 04 16:29:02 crc kubenswrapper[4676]: I1204 16:29:02.889577 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerDied","Data":"f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892"}
Dec 04 16:29:02 crc kubenswrapper[4676]: I1204 16:29:02.889850 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerStarted","Data":"b9b48349ffbea14342dac5cfbb931647efc1ebfd4320ebd62492e9f953d0e02a"}
Dec 04 16:29:03 crc kubenswrapper[4676]: I1204 16:29:03.903879 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerStarted","Data":"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"}
Dec 04 16:29:04 crc kubenswrapper[4676]: I1204 16:29:04.919084 4676 generic.go:334] "Generic (PLEG): container finished" podID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerID="d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647" exitCode=0
Dec 04 16:29:04 crc kubenswrapper[4676]: I1204 16:29:04.919160 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerDied","Data":"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"}
Dec 04 16:29:05 crc kubenswrapper[4676]: I1204 16:29:05.934114 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerStarted","Data":"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"}
Dec 04 16:29:05 crc kubenswrapper[4676]: I1204 16:29:05.961628 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q75k6" podStartSLOduration=3.513576107 podStartE2EDuration="5.961607887s" podCreationTimestamp="2025-12-04 16:29:00 +0000 UTC" firstStartedPulling="2025-12-04 16:29:02.893246424 +0000 UTC m=+4150.327916291" lastFinishedPulling="2025-12-04 16:29:05.341278204 +0000 UTC m=+4152.775948071" observedRunningTime="2025-12-04 16:29:05.956720458 +0000 UTC m=+4153.391390325" watchObservedRunningTime="2025-12-04 16:29:05.961607887 +0000 UTC m=+4153.396277744"
Dec 04 16:29:11 crc kubenswrapper[4676]: I1204 16:29:11.256111 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:11 crc kubenswrapper[4676]: I1204 16:29:11.256635 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:11 crc kubenswrapper[4676]: I1204 16:29:11.451270 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:12 crc kubenswrapper[4676]: I1204 16:29:12.042389 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:12 crc kubenswrapper[4676]: I1204 16:29:12.105437 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.010618 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q75k6" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="registry-server" containerID="cri-o://43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60" gracePeriod=2
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.510839 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.518324 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities\") pod \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") "
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.518504 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content\") pod \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") "
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.518751 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7rl6\" (UniqueName: \"kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6\") pod \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\" (UID: \"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e\") "
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.519211 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities" (OuterVolumeSpecName: "utilities") pod "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" (UID: "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.519527 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.525343 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6" (OuterVolumeSpecName: "kube-api-access-m7rl6") pod "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" (UID: "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e"). InnerVolumeSpecName "kube-api-access-m7rl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.585633 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" (UID: "d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.620676 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7rl6\" (UniqueName: \"kubernetes.io/projected/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-kube-api-access-m7rl6\") on node \"crc\" DevicePath \"\""
Dec 04 16:29:14 crc kubenswrapper[4676]: I1204 16:29:14.620716 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.024444 4676 generic.go:334] "Generic (PLEG): container finished" podID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerID="43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60" exitCode=0
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.024514 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerDied","Data":"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"}
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.024545 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q75k6"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.024834 4676 scope.go:117] "RemoveContainer" containerID="43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.024813 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q75k6" event={"ID":"d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e","Type":"ContainerDied","Data":"b9b48349ffbea14342dac5cfbb931647efc1ebfd4320ebd62492e9f953d0e02a"}
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.067299 4676 scope.go:117] "RemoveContainer" containerID="d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.084655 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.101843 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q75k6"]
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.104423 4676 scope.go:117] "RemoveContainer" containerID="f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.136111 4676 scope.go:117] "RemoveContainer" containerID="43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"
Dec 04 16:29:15 crc kubenswrapper[4676]: E1204 16:29:15.136545 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60\": container with ID starting with 43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60 not found: ID does not exist" containerID="43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.136585 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60"} err="failed to get container status \"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60\": rpc error: code = NotFound desc = could not find container \"43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60\": container with ID starting with 43baa4a1ae65c350b20ceb3c7806d8d507a7ce92ea189dfeb228bde5aec1aa60 not found: ID does not exist"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.136609 4676 scope.go:117] "RemoveContainer" containerID="d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"
Dec 04 16:29:15 crc kubenswrapper[4676]: E1204 16:29:15.136939 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647\": container with ID starting with d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647 not found: ID does not exist" containerID="d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.136998 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647"} err="failed to get container status \"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647\": rpc error: code = NotFound desc = could not find container \"d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647\": container with ID starting with d95c270a6c0c30f6025185ee9bac02faff931e4459f70f7ba8f7ad15a1140647 not found: ID does not exist"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.137034 4676 scope.go:117] "RemoveContainer" containerID="f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892"
Dec 04 16:29:15 crc kubenswrapper[4676]: E1204 16:29:15.137391 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892\": container with ID starting with f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892 not found: ID does not exist" containerID="f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.137439 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892"} err="failed to get container status \"f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892\": rpc error: code = NotFound desc = could not find container \"f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892\": container with ID starting with f4226c520a871ae5d9699e24d47212a22f509eb58bcf0105ceaf07467db34892 not found: ID does not exist"
Dec 04 16:29:15 crc kubenswrapper[4676]: I1204 16:29:15.403096 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" path="/var/lib/kubelet/pods/d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e/volumes"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.179553 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"]
Dec 04 16:30:00 crc kubenswrapper[4676]: E1204 16:30:00.180513 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="extract-content"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.180527 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="extract-content"
Dec 04 16:30:00 crc kubenswrapper[4676]: E1204 16:30:00.180545 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="extract-utilities"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.180551 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="extract-utilities"
Dec 04 16:30:00 crc kubenswrapper[4676]: E1204 16:30:00.180578 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="registry-server"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.180584 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="registry-server"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.180780 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="d604aaeb-1ee1-4d3a-876a-69a38bc8ca7e" containerName="registry-server"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.181713 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.189493 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.190033 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.202326 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"]
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.350689 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmtc6\" (UniqueName: \"kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.350852 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.350929 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.453097 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.453437 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.453681 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmtc6\" (UniqueName: \"kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.468112 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.494211 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.494582 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmtc6\" (UniqueName: \"kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6\") pod \"collect-profiles-29414430-p7l2d\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.506129 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:00 crc kubenswrapper[4676]: I1204 16:30:00.938530 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"]
Dec 04 16:30:01 crc kubenswrapper[4676]: I1204 16:30:01.558606 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d" event={"ID":"e4b41a73-2ae9-479f-8221-f45b7d12766e","Type":"ContainerStarted","Data":"adfd8aa72f8f296841b0e82ff382de8857810f73b6904e1bd35e050add6dfd71"}
Dec 04 16:30:02 crc kubenswrapper[4676]: I1204 16:30:02.572898 4676 generic.go:334] "Generic (PLEG): container finished" podID="e4b41a73-2ae9-479f-8221-f45b7d12766e" containerID="b1220a15fe3c09ab082e7ed6c008a25e1d5da2b2a64822cf8e89f37e4bd30d70" exitCode=0
Dec 04 16:30:02 crc kubenswrapper[4676]: I1204 16:30:02.573006 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d" event={"ID":"e4b41a73-2ae9-479f-8221-f45b7d12766e","Type":"ContainerDied","Data":"b1220a15fe3c09ab082e7ed6c008a25e1d5da2b2a64822cf8e89f37e4bd30d70"}
Dec 04 16:30:03 crc kubenswrapper[4676]: I1204 16:30:03.996158 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d"
Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.154520 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmtc6\" (UniqueName: \"kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6\") pod \"e4b41a73-2ae9-479f-8221-f45b7d12766e\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") "
Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.154625 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume\") pod \"e4b41a73-2ae9-479f-8221-f45b7d12766e\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") "
Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.154657 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume\") pod \"e4b41a73-2ae9-479f-8221-f45b7d12766e\" (UID: \"e4b41a73-2ae9-479f-8221-f45b7d12766e\") "
Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.155493 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume" (OuterVolumeSpecName: "config-volume") pod "e4b41a73-2ae9-479f-8221-f45b7d12766e" (UID: "e4b41a73-2ae9-479f-8221-f45b7d12766e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.170816 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e4b41a73-2ae9-479f-8221-f45b7d12766e" (UID: "e4b41a73-2ae9-479f-8221-f45b7d12766e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
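A small aside on the job name: collect-profiles-29414430-p7l2d follows the Kubernetes CronJob convention of suffixing each Job with its scheduled run time expressed in minutes since the Unix epoch (the previous run, collect-profiles-29414385-9656g, is deleted a few records below, 45 minutes older by the same arithmetic). Decoding the suffix reproduces the 16:30 schedule seen in these records:

#!/usr/bin/env python3
# Decode the numeric suffix of collect-profiles-29414430-p7l2d: CronJob
# controllers name each Job after its scheduled time in minutes since the
# Unix epoch.
from datetime import datetime, timezone

suffix = 29414430
scheduled = datetime.fromtimestamp(suffix * 60, tz=timezone.utc)
print(scheduled.isoformat())  # 2025-12-04T16:30:00+00:00, matching the log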
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.170863 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6" (OuterVolumeSpecName: "kube-api-access-qmtc6") pod "e4b41a73-2ae9-479f-8221-f45b7d12766e" (UID: "e4b41a73-2ae9-479f-8221-f45b7d12766e"). InnerVolumeSpecName "kube-api-access-qmtc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.257654 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmtc6\" (UniqueName: \"kubernetes.io/projected/e4b41a73-2ae9-479f-8221-f45b7d12766e-kube-api-access-qmtc6\") on node \"crc\" DevicePath \"\"" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.257690 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e4b41a73-2ae9-479f-8221-f45b7d12766e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.257702 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4b41a73-2ae9-479f-8221-f45b7d12766e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.594355 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d" event={"ID":"e4b41a73-2ae9-479f-8221-f45b7d12766e","Type":"ContainerDied","Data":"adfd8aa72f8f296841b0e82ff382de8857810f73b6904e1bd35e050add6dfd71"} Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.594674 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adfd8aa72f8f296841b0e82ff382de8857810f73b6904e1bd35e050add6dfd71" Dec 04 16:30:04 crc kubenswrapper[4676]: I1204 16:30:04.594398 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414430-p7l2d" Dec 04 16:30:05 crc kubenswrapper[4676]: I1204 16:30:05.083811 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g"] Dec 04 16:30:05 crc kubenswrapper[4676]: I1204 16:30:05.095126 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414385-9656g"] Dec 04 16:30:05 crc kubenswrapper[4676]: I1204 16:30:05.400624 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20692633-6767-45ee-8e4b-e89de3a134a5" path="/var/lib/kubelet/pods/20692633-6767-45ee-8e4b-e89de3a134a5/volumes" Dec 04 16:30:16 crc kubenswrapper[4676]: I1204 16:30:16.026977 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:30:16 crc kubenswrapper[4676]: I1204 16:30:16.027699 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:30:46 crc kubenswrapper[4676]: I1204 16:30:46.026828 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:30:46 crc kubenswrapper[4676]: I1204 16:30:46.027460 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:31:03 crc kubenswrapper[4676]: I1204 16:31:03.255322 4676 scope.go:117] "RemoveContainer" containerID="9cec22e7763aa207a6df1fdd9de1966b4a24c8a61cdcfd873a14e02da0955f9e" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.026800 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.027379 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.027442 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.028507 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.028605 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" gracePeriod=600 Dec 04 16:31:16 crc kubenswrapper[4676]: E1204 16:31:16.278392 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.590245 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" exitCode=0 Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.590291 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8"} Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.590325 4676 scope.go:117] "RemoveContainer" containerID="a27219e82cb5df25ee12c4a70a158ce63b00fc2e23d5223df55724721043c2d8" Dec 04 16:31:16 crc kubenswrapper[4676]: I1204 16:31:16.591562 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:31:16 crc kubenswrapper[4676]: E1204 16:31:16.592127 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:31:29 crc kubenswrapper[4676]: I1204 16:31:29.384844 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:31:29 crc kubenswrapper[4676]: E1204 16:31:29.385701 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:31:40 crc kubenswrapper[4676]: I1204 16:31:40.384727 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:31:40 crc kubenswrapper[4676]: E1204 16:31:40.385848 4676 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:31:54 crc kubenswrapper[4676]: I1204 16:31:54.384184 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:31:54 crc kubenswrapper[4676]: E1204 16:31:54.384896 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:32:06 crc kubenswrapper[4676]: I1204 16:32:06.384301 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:32:06 crc kubenswrapper[4676]: E1204 16:32:06.385183 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:32:16 crc kubenswrapper[4676]: E1204 16:32:16.421790 4676 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:54478->38.102.83.158:40877: write tcp 38.102.83.158:54478->38.102.83.158:40877: write: broken pipe Dec 04 16:32:21 crc kubenswrapper[4676]: I1204 16:32:21.386698 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:32:21 crc kubenswrapper[4676]: E1204 16:32:21.387814 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:32:35 crc kubenswrapper[4676]: I1204 16:32:35.385873 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:32:35 crc kubenswrapper[4676]: E1204 16:32:35.387018 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.459855 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:38 crc kubenswrapper[4676]: E1204 
16:32:38.468567 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b41a73-2ae9-479f-8221-f45b7d12766e" containerName="collect-profiles" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.468882 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b41a73-2ae9-479f-8221-f45b7d12766e" containerName="collect-profiles" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.469178 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b41a73-2ae9-479f-8221-f45b7d12766e" containerName="collect-profiles" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.470804 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.495623 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.567211 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.567331 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.567393 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4fj8\" (UniqueName: \"kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.669664 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.669781 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.669835 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4fj8\" (UniqueName: \"kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.670410 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.670419 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.702164 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4fj8\" (UniqueName: \"kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8\") pod \"redhat-marketplace-5mzt5\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:38 crc kubenswrapper[4676]: I1204 16:32:38.795260 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:39 crc kubenswrapper[4676]: I1204 16:32:39.306182 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:39 crc kubenswrapper[4676]: I1204 16:32:39.644740 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerID="afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9" exitCode=0 Dec 04 16:32:39 crc kubenswrapper[4676]: I1204 16:32:39.645012 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerDied","Data":"afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9"} Dec 04 16:32:39 crc kubenswrapper[4676]: I1204 16:32:39.645044 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerStarted","Data":"78026c81464d8dbc8a58ceb82006ac13d9816a21679162744ab714f870a7e8be"} Dec 04 16:32:40 crc kubenswrapper[4676]: I1204 16:32:40.657161 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerStarted","Data":"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597"} Dec 04 16:32:41 crc kubenswrapper[4676]: I1204 16:32:41.675887 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerID="f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597" exitCode=0 Dec 04 16:32:41 crc kubenswrapper[4676]: I1204 16:32:41.675990 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerDied","Data":"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597"} Dec 04 16:32:42 crc kubenswrapper[4676]: I1204 16:32:42.689999 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerStarted","Data":"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7"} Dec 04 16:32:42 crc kubenswrapper[4676]: I1204 16:32:42.721533 4676 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5mzt5" podStartSLOduration=2.271338832 podStartE2EDuration="4.721511834s" podCreationTimestamp="2025-12-04 16:32:38 +0000 UTC" firstStartedPulling="2025-12-04 16:32:39.648837067 +0000 UTC m=+4367.083506924" lastFinishedPulling="2025-12-04 16:32:42.099010039 +0000 UTC m=+4369.533679926" observedRunningTime="2025-12-04 16:32:42.710461168 +0000 UTC m=+4370.145131035" watchObservedRunningTime="2025-12-04 16:32:42.721511834 +0000 UTC m=+4370.156181701" Dec 04 16:32:47 crc kubenswrapper[4676]: I1204 16:32:47.384416 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:32:47 crc kubenswrapper[4676]: E1204 16:32:47.384955 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:32:48 crc kubenswrapper[4676]: I1204 16:32:48.796206 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:48 crc kubenswrapper[4676]: I1204 16:32:48.796874 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:48 crc kubenswrapper[4676]: I1204 16:32:48.846828 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:49 crc kubenswrapper[4676]: I1204 16:32:49.832512 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:49 crc kubenswrapper[4676]: I1204 16:32:49.883075 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:51 crc kubenswrapper[4676]: I1204 16:32:51.794964 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5mzt5" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="registry-server" containerID="cri-o://86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7" gracePeriod=2 Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.272252 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.360517 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4fj8\" (UniqueName: \"kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8\") pod \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.360694 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities\") pod \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.360730 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content\") pod \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\" (UID: \"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc\") " Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.361585 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities" (OuterVolumeSpecName: "utilities") pod "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" (UID: "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.373580 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8" (OuterVolumeSpecName: "kube-api-access-t4fj8") pod "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" (UID: "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc"). InnerVolumeSpecName "kube-api-access-t4fj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.380048 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" (UID: "ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.463649 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.463892 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.463929 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4fj8\" (UniqueName: \"kubernetes.io/projected/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc-kube-api-access-t4fj8\") on node \"crc\" DevicePath \"\"" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.807279 4676 generic.go:334] "Generic (PLEG): container finished" podID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerID="86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7" exitCode=0 Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.807348 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerDied","Data":"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7"} Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.807404 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mzt5" event={"ID":"ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc","Type":"ContainerDied","Data":"78026c81464d8dbc8a58ceb82006ac13d9816a21679162744ab714f870a7e8be"} Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.807437 4676 scope.go:117] "RemoveContainer" containerID="86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.807530 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mzt5" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.832248 4676 scope.go:117] "RemoveContainer" containerID="f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.855501 4676 scope.go:117] "RemoveContainer" containerID="afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.862067 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.876882 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mzt5"] Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.911376 4676 scope.go:117] "RemoveContainer" containerID="86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7" Dec 04 16:32:52 crc kubenswrapper[4676]: E1204 16:32:52.911857 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7\": container with ID starting with 86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7 not found: ID does not exist" containerID="86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.911920 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7"} err="failed to get container status \"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7\": rpc error: code = NotFound desc = could not find container \"86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7\": container with ID starting with 86480fedbffaf705039f52f9a7a5fc77b4cf8c063b6112fbf295ed52be631ca7 not found: ID does not exist" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.911941 4676 scope.go:117] "RemoveContainer" containerID="f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597" Dec 04 16:32:52 crc kubenswrapper[4676]: E1204 16:32:52.912172 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597\": container with ID starting with f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597 not found: ID does not exist" containerID="f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.912202 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597"} err="failed to get container status \"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597\": rpc error: code = NotFound desc = could not find container \"f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597\": container with ID starting with f5e307db8fb126f24fd28af95aea4fec7cb0f6390a8802cdc4ee1e1fc9b59597 not found: ID does not exist" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.912215 4676 scope.go:117] "RemoveContainer" containerID="afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9" Dec 04 16:32:52 crc kubenswrapper[4676]: E1204 16:32:52.912393 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9\": container with ID starting with afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9 not found: ID does not exist" containerID="afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9" Dec 04 16:32:52 crc kubenswrapper[4676]: I1204 16:32:52.912416 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9"} err="failed to get container status \"afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9\": rpc error: code = NotFound desc = could not find container \"afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9\": container with ID starting with afb6d0a60d6b2edaef18f5638fb16fabdab2400b0b302f448ace481ad1712bf9 not found: ID does not exist" Dec 04 16:32:53 crc kubenswrapper[4676]: I1204 16:32:53.398761 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" path="/var/lib/kubelet/pods/ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc/volumes" Dec 04 16:33:02 crc kubenswrapper[4676]: I1204 16:33:02.384813 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:33:02 crc kubenswrapper[4676]: E1204 16:33:02.385631 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:33:13 crc kubenswrapper[4676]: I1204 16:33:13.391775 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:33:13 crc kubenswrapper[4676]: E1204 16:33:13.392543 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:33:26 crc kubenswrapper[4676]: I1204 16:33:26.385020 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:33:26 crc kubenswrapper[4676]: E1204 16:33:26.385890 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:33:41 crc kubenswrapper[4676]: I1204 16:33:41.384290 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:33:41 crc kubenswrapper[4676]: E1204 16:33:41.385071 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:33:55 crc kubenswrapper[4676]: I1204 16:33:55.384919 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:33:55 crc kubenswrapper[4676]: E1204 16:33:55.385725 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:34:06 crc kubenswrapper[4676]: I1204 16:34:06.385216 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:34:06 crc kubenswrapper[4676]: E1204 16:34:06.386178 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:34:19 crc kubenswrapper[4676]: I1204 16:34:19.385311 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:34:19 crc kubenswrapper[4676]: E1204 16:34:19.386181 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:34:34 crc kubenswrapper[4676]: I1204 16:34:34.384322 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:34:34 crc kubenswrapper[4676]: E1204 16:34:34.385098 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:34:45 crc kubenswrapper[4676]: I1204 16:34:45.385069 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:34:45 crc kubenswrapper[4676]: E1204 16:34:45.386817 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:00 crc kubenswrapper[4676]: I1204 16:35:00.384462 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:35:00 crc kubenswrapper[4676]: E1204 16:35:00.385307 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:14 crc kubenswrapper[4676]: I1204 16:35:14.384963 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:35:14 crc kubenswrapper[4676]: E1204 16:35:14.385866 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:27 crc kubenswrapper[4676]: I1204 16:35:27.384689 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:35:27 crc kubenswrapper[4676]: E1204 16:35:27.385400 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:39 crc kubenswrapper[4676]: I1204 16:35:39.384412 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:35:39 crc kubenswrapper[4676]: E1204 16:35:39.386179 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.184849 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:35:54 crc kubenswrapper[4676]: E1204 16:35:54.186365 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="registry-server" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.186386 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="registry-server" Dec 04 16:35:54 crc kubenswrapper[4676]: E1204 16:35:54.186435 4676 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="extract-content" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.186443 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="extract-content" Dec 04 16:35:54 crc kubenswrapper[4676]: E1204 16:35:54.186462 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="extract-utilities" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.186471 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="extract-utilities" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.186706 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea9040ff-d38d-4b9b-8c6e-11f4e2e24efc" containerName="registry-server" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.188811 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.201483 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.272895 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcjcx\" (UniqueName: \"kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.273066 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.273110 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.375305 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcjcx\" (UniqueName: \"kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.375409 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.375462 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.376076 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.376101 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.384458 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:35:54 crc kubenswrapper[4676]: E1204 16:35:54.384810 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.416284 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcjcx\" (UniqueName: \"kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx\") pod \"redhat-operators-5bvdx\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:54 crc kubenswrapper[4676]: I1204 16:35:54.517593 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:35:55 crc kubenswrapper[4676]: I1204 16:35:55.054326 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:35:55 crc kubenswrapper[4676]: I1204 16:35:55.685415 4676 generic.go:334] "Generic (PLEG): container finished" podID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerID="b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac" exitCode=0 Dec 04 16:35:55 crc kubenswrapper[4676]: I1204 16:35:55.685629 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerDied","Data":"b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac"} Dec 04 16:35:55 crc kubenswrapper[4676]: I1204 16:35:55.686757 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerStarted","Data":"83c36b475cb4bd090c890568308a521a283f1d184a70ed56cfc71e7f700310c6"} Dec 04 16:35:55 crc kubenswrapper[4676]: I1204 16:35:55.687675 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 16:35:57 crc kubenswrapper[4676]: I1204 16:35:57.709449 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerStarted","Data":"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11"} Dec 04 16:35:58 crc kubenswrapper[4676]: I1204 16:35:58.719197 4676 generic.go:334] "Generic (PLEG): container finished" podID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerID="3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11" exitCode=0 Dec 04 16:35:58 crc kubenswrapper[4676]: I1204 16:35:58.719237 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerDied","Data":"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11"} Dec 04 16:36:01 crc kubenswrapper[4676]: I1204 16:36:01.753039 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerStarted","Data":"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9"} Dec 04 16:36:01 crc kubenswrapper[4676]: I1204 16:36:01.773586 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5bvdx" podStartSLOduration=2.9309028550000003 podStartE2EDuration="7.773558385s" podCreationTimestamp="2025-12-04 16:35:54 +0000 UTC" firstStartedPulling="2025-12-04 16:35:55.687465408 +0000 UTC m=+4563.122135265" lastFinishedPulling="2025-12-04 16:36:00.530120938 +0000 UTC m=+4567.964790795" observedRunningTime="2025-12-04 16:36:01.772742232 +0000 UTC m=+4569.207412099" watchObservedRunningTime="2025-12-04 16:36:01.773558385 +0000 UTC m=+4569.208228262" Dec 04 16:36:04 crc kubenswrapper[4676]: I1204 16:36:04.517718 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:04 crc kubenswrapper[4676]: I1204 16:36:04.518302 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:05 crc 
kubenswrapper[4676]: I1204 16:36:05.573359 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bvdx" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="registry-server" probeResult="failure" output=< Dec 04 16:36:05 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 16:36:05 crc kubenswrapper[4676]: > Dec 04 16:36:08 crc kubenswrapper[4676]: I1204 16:36:08.385041 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:36:08 crc kubenswrapper[4676]: E1204 16:36:08.386094 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:36:14 crc kubenswrapper[4676]: I1204 16:36:14.593156 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:14 crc kubenswrapper[4676]: I1204 16:36:14.655762 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:14 crc kubenswrapper[4676]: I1204 16:36:14.840483 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:36:15 crc kubenswrapper[4676]: I1204 16:36:15.940104 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5bvdx" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="registry-server" containerID="cri-o://ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9" gracePeriod=2 Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.747566 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.965559 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcjcx\" (UniqueName: \"kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx\") pod \"c6a027af-52f8-4113-8192-46bc9c0695fe\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.965869 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities\") pod \"c6a027af-52f8-4113-8192-46bc9c0695fe\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.965951 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content\") pod \"c6a027af-52f8-4113-8192-46bc9c0695fe\" (UID: \"c6a027af-52f8-4113-8192-46bc9c0695fe\") " Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.968846 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities" (OuterVolumeSpecName: "utilities") pod "c6a027af-52f8-4113-8192-46bc9c0695fe" (UID: "c6a027af-52f8-4113-8192-46bc9c0695fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.972372 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx" (OuterVolumeSpecName: "kube-api-access-xcjcx") pod "c6a027af-52f8-4113-8192-46bc9c0695fe" (UID: "c6a027af-52f8-4113-8192-46bc9c0695fe"). InnerVolumeSpecName "kube-api-access-xcjcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.986512 4676 generic.go:334] "Generic (PLEG): container finished" podID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerID="ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9" exitCode=0 Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.986561 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerDied","Data":"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9"} Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.986590 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bvdx" event={"ID":"c6a027af-52f8-4113-8192-46bc9c0695fe","Type":"ContainerDied","Data":"83c36b475cb4bd090c890568308a521a283f1d184a70ed56cfc71e7f700310c6"} Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.986606 4676 scope.go:117] "RemoveContainer" containerID="ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9" Dec 04 16:36:16 crc kubenswrapper[4676]: I1204 16:36:16.986752 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bvdx" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.047930 4676 scope.go:117] "RemoveContainer" containerID="3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.072563 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcjcx\" (UniqueName: \"kubernetes.io/projected/c6a027af-52f8-4113-8192-46bc9c0695fe-kube-api-access-xcjcx\") on node \"crc\" DevicePath \"\"" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.072599 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.085788 4676 scope.go:117] "RemoveContainer" containerID="b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.095628 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6a027af-52f8-4113-8192-46bc9c0695fe" (UID: "c6a027af-52f8-4113-8192-46bc9c0695fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.121267 4676 scope.go:117] "RemoveContainer" containerID="ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9" Dec 04 16:36:17 crc kubenswrapper[4676]: E1204 16:36:17.121678 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9\": container with ID starting with ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9 not found: ID does not exist" containerID="ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.121722 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9"} err="failed to get container status \"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9\": rpc error: code = NotFound desc = could not find container \"ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9\": container with ID starting with ef8eb10420ca5407b8cd79d870a9470c6b2d002ac16ca29ef7503db8ad2a12a9 not found: ID does not exist" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.121748 4676 scope.go:117] "RemoveContainer" containerID="3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11" Dec 04 16:36:17 crc kubenswrapper[4676]: E1204 16:36:17.122581 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11\": container with ID starting with 3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11 not found: ID does not exist" containerID="3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.122626 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11"} err="failed to get 
container status \"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11\": rpc error: code = NotFound desc = could not find container \"3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11\": container with ID starting with 3b906dd256eb9d04d5c49271f745efc8f2a260a30fff671fc9fac7e8f7fd9b11 not found: ID does not exist" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.122640 4676 scope.go:117] "RemoveContainer" containerID="b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac" Dec 04 16:36:17 crc kubenswrapper[4676]: E1204 16:36:17.122850 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac\": container with ID starting with b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac not found: ID does not exist" containerID="b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.122874 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac"} err="failed to get container status \"b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac\": rpc error: code = NotFound desc = could not find container \"b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac\": container with ID starting with b96fc03a4d5ff34b4828e4271e361725436765ec3e8fc3067ad33047e50c95ac not found: ID does not exist" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.175925 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6a027af-52f8-4113-8192-46bc9c0695fe-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.325636 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.334289 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5bvdx"] Dec 04 16:36:17 crc kubenswrapper[4676]: I1204 16:36:17.396629 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" path="/var/lib/kubelet/pods/c6a027af-52f8-4113-8192-46bc9c0695fe/volumes" Dec 04 16:36:23 crc kubenswrapper[4676]: I1204 16:36:23.391318 4676 scope.go:117] "RemoveContainer" containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:36:24 crc kubenswrapper[4676]: I1204 16:36:24.220612 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5"} Dec 04 16:38:46 crc kubenswrapper[4676]: I1204 16:38:46.026656 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:38:46 crc kubenswrapper[4676]: I1204 16:38:46.027313 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.461063 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:09 crc kubenswrapper[4676]: E1204 16:39:09.462230 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="registry-server" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.462268 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="registry-server" Dec 04 16:39:09 crc kubenswrapper[4676]: E1204 16:39:09.462284 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="extract-content" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.462293 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="extract-content" Dec 04 16:39:09 crc kubenswrapper[4676]: E1204 16:39:09.462325 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="extract-utilities" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.462332 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="extract-utilities" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.462565 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a027af-52f8-4113-8192-46bc9c0695fe" containerName="registry-server" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.464509 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.472128 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.545239 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.545767 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr245\" (UniqueName: \"kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.546051 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.648775 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr245\" (UniqueName: \"kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.648865 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.648970 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.649354 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.649583 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.681856 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fr245\" (UniqueName: \"kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245\") pod \"community-operators-x2wn5\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:09 crc kubenswrapper[4676]: I1204 16:39:09.791956 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:10 crc kubenswrapper[4676]: I1204 16:39:10.399330 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:11 crc kubenswrapper[4676]: I1204 16:39:11.406443 4676 generic.go:334] "Generic (PLEG): container finished" podID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerID="2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7" exitCode=0 Dec 04 16:39:11 crc kubenswrapper[4676]: I1204 16:39:11.406535 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerDied","Data":"2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7"} Dec 04 16:39:11 crc kubenswrapper[4676]: I1204 16:39:11.406750 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerStarted","Data":"b0ca486af9b2c7c8f6ed3246f758b1712a7a7feb26f6bf79b95ebbf0e07ee0f6"} Dec 04 16:39:14 crc kubenswrapper[4676]: I1204 16:39:14.436110 4676 generic.go:334] "Generic (PLEG): container finished" podID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerID="ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16" exitCode=0 Dec 04 16:39:14 crc kubenswrapper[4676]: I1204 16:39:14.436179 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerDied","Data":"ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16"} Dec 04 16:39:16 crc kubenswrapper[4676]: I1204 16:39:16.026623 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:39:16 crc kubenswrapper[4676]: I1204 16:39:16.026957 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:39:18 crc kubenswrapper[4676]: I1204 16:39:18.480157 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerStarted","Data":"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712"} Dec 04 16:39:18 crc kubenswrapper[4676]: I1204 16:39:18.527467 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x2wn5" podStartSLOduration=5.30866616 podStartE2EDuration="9.527442295s" podCreationTimestamp="2025-12-04 16:39:09 +0000 UTC" 
firstStartedPulling="2025-12-04 16:39:12.419934776 +0000 UTC m=+4759.854604633" lastFinishedPulling="2025-12-04 16:39:16.638710911 +0000 UTC m=+4764.073380768" observedRunningTime="2025-12-04 16:39:18.504093048 +0000 UTC m=+4765.938762905" watchObservedRunningTime="2025-12-04 16:39:18.527442295 +0000 UTC m=+4765.962112162" Dec 04 16:39:19 crc kubenswrapper[4676]: I1204 16:39:19.793198 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:19 crc kubenswrapper[4676]: I1204 16:39:19.793596 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:19 crc kubenswrapper[4676]: I1204 16:39:19.867114 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:29 crc kubenswrapper[4676]: I1204 16:39:29.853884 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:29 crc kubenswrapper[4676]: I1204 16:39:29.911688 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:30 crc kubenswrapper[4676]: I1204 16:39:30.595116 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x2wn5" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="registry-server" containerID="cri-o://017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712" gracePeriod=2 Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.613660 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.617450 4676 generic.go:334] "Generic (PLEG): container finished" podID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerID="017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712" exitCode=0 Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.617501 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerDied","Data":"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712"} Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.617536 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x2wn5" event={"ID":"efdd437f-7cf6-440b-8692-ba579583dc4d","Type":"ContainerDied","Data":"b0ca486af9b2c7c8f6ed3246f758b1712a7a7feb26f6bf79b95ebbf0e07ee0f6"} Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.617558 4676 scope.go:117] "RemoveContainer" containerID="017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.650678 4676 scope.go:117] "RemoveContainer" containerID="ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.686857 4676 scope.go:117] "RemoveContainer" containerID="2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.746852 4676 scope.go:117] "RemoveContainer" containerID="017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712" Dec 04 16:39:31 crc kubenswrapper[4676]: E1204 16:39:31.747292 4676 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712\": container with ID starting with 017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712 not found: ID does not exist" containerID="017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.747342 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712"} err="failed to get container status \"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712\": rpc error: code = NotFound desc = could not find container \"017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712\": container with ID starting with 017b97781b48ba90bbf7249e898d855fc541ecf14579b2b16c29f34bbb002712 not found: ID does not exist" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.747372 4676 scope.go:117] "RemoveContainer" containerID="ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16" Dec 04 16:39:31 crc kubenswrapper[4676]: E1204 16:39:31.747731 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16\": container with ID starting with ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16 not found: ID does not exist" containerID="ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.747758 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16"} err="failed to get container status \"ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16\": rpc error: code = NotFound desc = could not find container \"ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16\": container with ID starting with ed978d9c76df64c39cae1d6c94b5b3be8bdd89f7074f60e7d096286b188efb16 not found: ID does not exist" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.747772 4676 scope.go:117] "RemoveContainer" containerID="2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7" Dec 04 16:39:31 crc kubenswrapper[4676]: E1204 16:39:31.748288 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7\": container with ID starting with 2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7 not found: ID does not exist" containerID="2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.748356 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7"} err="failed to get container status \"2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7\": rpc error: code = NotFound desc = could not find container \"2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7\": container with ID starting with 2c669e7f65ebebd6905f019e126600fbc10649171529771f4689415af2cf2ea7 not found: ID does not exist" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.815454 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fr245\" (UniqueName: \"kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245\") pod \"efdd437f-7cf6-440b-8692-ba579583dc4d\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.816017 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content\") pod \"efdd437f-7cf6-440b-8692-ba579583dc4d\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.816101 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities\") pod \"efdd437f-7cf6-440b-8692-ba579583dc4d\" (UID: \"efdd437f-7cf6-440b-8692-ba579583dc4d\") " Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.817011 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities" (OuterVolumeSpecName: "utilities") pod "efdd437f-7cf6-440b-8692-ba579583dc4d" (UID: "efdd437f-7cf6-440b-8692-ba579583dc4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.824106 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245" (OuterVolumeSpecName: "kube-api-access-fr245") pod "efdd437f-7cf6-440b-8692-ba579583dc4d" (UID: "efdd437f-7cf6-440b-8692-ba579583dc4d"). InnerVolumeSpecName "kube-api-access-fr245". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.874199 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efdd437f-7cf6-440b-8692-ba579583dc4d" (UID: "efdd437f-7cf6-440b-8692-ba579583dc4d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.918711 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.918744 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr245\" (UniqueName: \"kubernetes.io/projected/efdd437f-7cf6-440b-8692-ba579583dc4d-kube-api-access-fr245\") on node \"crc\" DevicePath \"\"" Dec 04 16:39:31 crc kubenswrapper[4676]: I1204 16:39:31.918755 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efdd437f-7cf6-440b-8692-ba579583dc4d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:39:32 crc kubenswrapper[4676]: I1204 16:39:32.627534 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x2wn5" Dec 04 16:39:32 crc kubenswrapper[4676]: I1204 16:39:32.677834 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:32 crc kubenswrapper[4676]: I1204 16:39:32.691823 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x2wn5"] Dec 04 16:39:33 crc kubenswrapper[4676]: I1204 16:39:33.402983 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" path="/var/lib/kubelet/pods/efdd437f-7cf6-440b-8692-ba579583dc4d/volumes" Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.026555 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.027076 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.027132 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.027734 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.027799 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5" gracePeriod=600 Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.777600 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5" exitCode=0 Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.777669 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5"} Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.777966 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"} Dec 04 16:39:46 crc kubenswrapper[4676]: I1204 16:39:46.778008 4676 scope.go:117] "RemoveContainer" 
containerID="5e7c54badcb16de03d4d77a894d88dbba4c0b9504f104fde7e6bec061f8432c8" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.916431 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:39:48 crc kubenswrapper[4676]: E1204 16:39:48.917483 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="registry-server" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.917502 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="registry-server" Dec 04 16:39:48 crc kubenswrapper[4676]: E1204 16:39:48.917536 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="extract-content" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.917545 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="extract-content" Dec 04 16:39:48 crc kubenswrapper[4676]: E1204 16:39:48.917573 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="extract-utilities" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.917581 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="extract-utilities" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.917840 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="efdd437f-7cf6-440b-8692-ba579583dc4d" containerName="registry-server" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.920017 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:48 crc kubenswrapper[4676]: I1204 16:39:48.934047 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.102647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.102775 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzzt\" (UniqueName: \"kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.102839 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.204623 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities\") pod \"certified-operators-7q2k6\" (UID: 
\"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.204769 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzzt\" (UniqueName: \"kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.204845 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.205324 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.205371 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.252164 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzzt\" (UniqueName: \"kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt\") pod \"certified-operators-7q2k6\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.255634 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:49 crc kubenswrapper[4676]: I1204 16:39:49.830848 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:39:50 crc kubenswrapper[4676]: I1204 16:39:50.823791 4676 generic.go:334] "Generic (PLEG): container finished" podID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerID="b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d" exitCode=0 Dec 04 16:39:50 crc kubenswrapper[4676]: I1204 16:39:50.824225 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerDied","Data":"b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d"} Dec 04 16:39:50 crc kubenswrapper[4676]: I1204 16:39:50.824612 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerStarted","Data":"ca78e77b6104e157a6e67c7827d3baff4c46cb79032a8ca1ae0458f5475b00b5"} Dec 04 16:39:52 crc kubenswrapper[4676]: I1204 16:39:52.846511 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerStarted","Data":"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df"} Dec 04 16:39:53 crc kubenswrapper[4676]: I1204 16:39:53.856559 4676 generic.go:334] "Generic (PLEG): container finished" podID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerID="e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df" exitCode=0 Dec 04 16:39:53 crc kubenswrapper[4676]: I1204 16:39:53.856603 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerDied","Data":"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df"} Dec 04 16:39:54 crc kubenswrapper[4676]: I1204 16:39:54.868862 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerStarted","Data":"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936"} Dec 04 16:39:54 crc kubenswrapper[4676]: I1204 16:39:54.898973 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7q2k6" podStartSLOduration=3.411127114 podStartE2EDuration="6.898949132s" podCreationTimestamp="2025-12-04 16:39:48 +0000 UTC" firstStartedPulling="2025-12-04 16:39:50.828988435 +0000 UTC m=+4798.263658292" lastFinishedPulling="2025-12-04 16:39:54.316810453 +0000 UTC m=+4801.751480310" observedRunningTime="2025-12-04 16:39:54.891538933 +0000 UTC m=+4802.326208800" watchObservedRunningTime="2025-12-04 16:39:54.898949132 +0000 UTC m=+4802.333618989" Dec 04 16:39:59 crc kubenswrapper[4676]: I1204 16:39:59.256589 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:39:59 crc kubenswrapper[4676]: I1204 16:39:59.256934 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:40:11 crc kubenswrapper[4676]: I1204 16:40:11.055572 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:40:11 crc kubenswrapper[4676]: I1204 16:40:11.276254 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:40:11 crc kubenswrapper[4676]: I1204 16:40:11.345941 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.151660 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7q2k6" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="registry-server" containerID="cri-o://fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936" gracePeriod=2 Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.633762 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.733103 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities\") pod \"87365df3-35d4-42eb-b7db-aded30ba8e90\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.733460 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzzt\" (UniqueName: \"kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt\") pod \"87365df3-35d4-42eb-b7db-aded30ba8e90\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.733536 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content\") pod \"87365df3-35d4-42eb-b7db-aded30ba8e90\" (UID: \"87365df3-35d4-42eb-b7db-aded30ba8e90\") " Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.734192 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities" (OuterVolumeSpecName: "utilities") pod "87365df3-35d4-42eb-b7db-aded30ba8e90" (UID: "87365df3-35d4-42eb-b7db-aded30ba8e90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.781480 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87365df3-35d4-42eb-b7db-aded30ba8e90" (UID: "87365df3-35d4-42eb-b7db-aded30ba8e90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.835972 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:40:12 crc kubenswrapper[4676]: I1204 16:40:12.836016 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87365df3-35d4-42eb-b7db-aded30ba8e90-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.163278 4676 generic.go:334] "Generic (PLEG): container finished" podID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerID="fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936" exitCode=0 Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.163360 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7q2k6" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.163358 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerDied","Data":"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936"} Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.163539 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7q2k6" event={"ID":"87365df3-35d4-42eb-b7db-aded30ba8e90","Type":"ContainerDied","Data":"ca78e77b6104e157a6e67c7827d3baff4c46cb79032a8ca1ae0458f5475b00b5"} Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.163597 4676 scope.go:117] "RemoveContainer" containerID="fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.185332 4676 scope.go:117] "RemoveContainer" containerID="e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.302637 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt" (OuterVolumeSpecName: "kube-api-access-pjzzt") pod "87365df3-35d4-42eb-b7db-aded30ba8e90" (UID: "87365df3-35d4-42eb-b7db-aded30ba8e90"). InnerVolumeSpecName "kube-api-access-pjzzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.327974 4676 scope.go:117] "RemoveContainer" containerID="b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.346001 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjzzt\" (UniqueName: \"kubernetes.io/projected/87365df3-35d4-42eb-b7db-aded30ba8e90-kube-api-access-pjzzt\") on node \"crc\" DevicePath \"\"" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.435308 4676 scope.go:117] "RemoveContainer" containerID="fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936" Dec 04 16:40:13 crc kubenswrapper[4676]: E1204 16:40:13.437594 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936\": container with ID starting with fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936 not found: ID does not exist" containerID="fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.437657 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936"} err="failed to get container status \"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936\": rpc error: code = NotFound desc = could not find container \"fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936\": container with ID starting with fb2bc9c657bbe97abb30103b06f2b2cbf04c2b5da4b9925c147a54d65dc5b936 not found: ID does not exist" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.437711 4676 scope.go:117] "RemoveContainer" containerID="e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df" Dec 04 16:40:13 crc kubenswrapper[4676]: E1204 16:40:13.438370 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df\": container with ID starting with e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df not found: ID does not exist" containerID="e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.438515 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df"} err="failed to get container status \"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df\": rpc error: code = NotFound desc = could not find container \"e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df\": container with ID starting with e61b0fb59d52daa543b978aef0ef92b70ded764d83bfcdf1ad47336ca242b2df not found: ID does not exist" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.438602 4676 scope.go:117] "RemoveContainer" containerID="b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d" Dec 04 16:40:13 crc kubenswrapper[4676]: E1204 16:40:13.439280 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d\": container with ID starting with b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d not found: ID does not 
exist" containerID="b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.439319 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d"} err="failed to get container status \"b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d\": rpc error: code = NotFound desc = could not find container \"b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d\": container with ID starting with b6069fc15950d52017ac7cb5d3cdff278ca784a285f9b825bc3596812778945d not found: ID does not exist" Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.494881 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:40:13 crc kubenswrapper[4676]: I1204 16:40:13.510212 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7q2k6"] Dec 04 16:40:15 crc kubenswrapper[4676]: I1204 16:40:15.395780 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" path="/var/lib/kubelet/pods/87365df3-35d4-42eb-b7db-aded30ba8e90/volumes" Dec 04 16:41:46 crc kubenswrapper[4676]: I1204 16:41:46.026806 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:41:46 crc kubenswrapper[4676]: I1204 16:41:46.027356 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:42:16 crc kubenswrapper[4676]: I1204 16:42:16.026407 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 16:42:16 crc kubenswrapper[4676]: I1204 16:42:16.026971 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.794438 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"] Dec 04 16:42:44 crc kubenswrapper[4676]: E1204 16:42:44.795447 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="extract-content" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.795476 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="extract-content" Dec 04 16:42:44 crc kubenswrapper[4676]: E1204 16:42:44.795514 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="extract-utilities" Dec 04 
16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.795523 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="extract-utilities" Dec 04 16:42:44 crc kubenswrapper[4676]: E1204 16:42:44.795551 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="registry-server" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.795560 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="registry-server" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.795839 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="87365df3-35d4-42eb-b7db-aded30ba8e90" containerName="registry-server" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.797628 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.817444 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"] Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.908445 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.908505 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:44 crc kubenswrapper[4676]: I1204 16:42:44.908639 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4msc\" (UniqueName: \"kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.011257 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.011308 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr" Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.011354 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4msc\" (UniqueName: \"kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " 
pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.012142 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.012372 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.045247 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4msc\" (UniqueName: \"kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc\") pod \"redhat-marketplace-pblzr\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") " pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.130681 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:45 crc kubenswrapper[4676]: I1204 16:42:45.633842 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"]
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.027017 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.027097 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.027152 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.028238 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.028335 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb" gracePeriod=600
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.549827 4676 generic.go:334] "Generic (PLEG): container finished" podID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerID="bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1" exitCode=0
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.549948 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerDied","Data":"bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1"}
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.550568 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerStarted","Data":"f9632ebaae9514b04f11bf58f2efb145fe154dc973cc6764ac0fcd75cee63965"}
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.553733 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.559327 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb" exitCode=0
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.559389 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"}
Dec 04 16:42:46 crc kubenswrapper[4676]: I1204 16:42:46.559481 4676 scope.go:117] "RemoveContainer" containerID="111dc5dc62868d29b06759400f350cf381dea26ee6ac59555c0ad9280f51a7d5"
Dec 04 16:42:46 crc kubenswrapper[4676]: E1204 16:42:46.819733 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:42:47 crc kubenswrapper[4676]: I1204 16:42:47.574665 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:42:47 crc kubenswrapper[4676]: E1204 16:42:47.575368 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:42:48 crc kubenswrapper[4676]: I1204 16:42:48.584285 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerStarted","Data":"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"}
Dec 04 16:42:49 crc kubenswrapper[4676]: I1204 16:42:49.605979 4676 generic.go:334] "Generic (PLEG): container finished" podID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerID="2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c" exitCode=0
Dec 04 16:42:49 crc kubenswrapper[4676]: I1204 16:42:49.606044 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerDied","Data":"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"}
Dec 04 16:42:50 crc kubenswrapper[4676]: I1204 16:42:50.618941 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerStarted","Data":"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"}
Dec 04 16:42:50 crc kubenswrapper[4676]: I1204 16:42:50.640998 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pblzr" podStartSLOduration=3.157474628 podStartE2EDuration="6.640958674s" podCreationTimestamp="2025-12-04 16:42:44 +0000 UTC" firstStartedPulling="2025-12-04 16:42:46.552967158 +0000 UTC m=+4973.987637035" lastFinishedPulling="2025-12-04 16:42:50.036451234 +0000 UTC m=+4977.471121081" observedRunningTime="2025-12-04 16:42:50.638495154 +0000 UTC m=+4978.073165021" watchObservedRunningTime="2025-12-04 16:42:50.640958674 +0000 UTC m=+4978.075628541"
Dec 04 16:42:55 crc kubenswrapper[4676]: I1204 16:42:55.130852 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:55 crc kubenswrapper[4676]: I1204 16:42:55.131404 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:55 crc kubenswrapper[4676]: I1204 16:42:55.186449 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:55 crc kubenswrapper[4676]: I1204 16:42:55.728893 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:55 crc kubenswrapper[4676]: I1204 16:42:55.785277 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"]
Dec 04 16:42:57 crc kubenswrapper[4676]: I1204 16:42:57.691471 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pblzr" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="registry-server" containerID="cri-o://d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8" gracePeriod=2
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.259442 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.384295 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:42:58 crc kubenswrapper[4676]: E1204 16:42:58.384629 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.406344 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities\") pod \"5146cd1f-b9bd-4a09-bd37-d683174bc548\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") "
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.406512 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content\") pod \"5146cd1f-b9bd-4a09-bd37-d683174bc548\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") "
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.406621 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4msc\" (UniqueName: \"kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc\") pod \"5146cd1f-b9bd-4a09-bd37-d683174bc548\" (UID: \"5146cd1f-b9bd-4a09-bd37-d683174bc548\") "
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.407771 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities" (OuterVolumeSpecName: "utilities") pod "5146cd1f-b9bd-4a09-bd37-d683174bc548" (UID: "5146cd1f-b9bd-4a09-bd37-d683174bc548"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.419289 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc" (OuterVolumeSpecName: "kube-api-access-j4msc") pod "5146cd1f-b9bd-4a09-bd37-d683174bc548" (UID: "5146cd1f-b9bd-4a09-bd37-d683174bc548"). InnerVolumeSpecName "kube-api-access-j4msc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.431097 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5146cd1f-b9bd-4a09-bd37-d683174bc548" (UID: "5146cd1f-b9bd-4a09-bd37-d683174bc548"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.511438 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.511969 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4msc\" (UniqueName: \"kubernetes.io/projected/5146cd1f-b9bd-4a09-bd37-d683174bc548-kube-api-access-j4msc\") on node \"crc\" DevicePath \"\""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.511985 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5146cd1f-b9bd-4a09-bd37-d683174bc548-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.702320 4676 generic.go:334] "Generic (PLEG): container finished" podID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerID="d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8" exitCode=0
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.702397 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerDied","Data":"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"}
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.703080 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pblzr" event={"ID":"5146cd1f-b9bd-4a09-bd37-d683174bc548","Type":"ContainerDied","Data":"f9632ebaae9514b04f11bf58f2efb145fe154dc973cc6764ac0fcd75cee63965"}
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.703113 4676 scope.go:117] "RemoveContainer" containerID="d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.702520 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pblzr"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.744864 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"]
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.748107 4676 scope.go:117] "RemoveContainer" containerID="2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.758448 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pblzr"]
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.779961 4676 scope.go:117] "RemoveContainer" containerID="bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.839456 4676 scope.go:117] "RemoveContainer" containerID="d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"
Dec 04 16:42:58 crc kubenswrapper[4676]: E1204 16:42:58.840012 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8\": container with ID starting with d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8 not found: ID does not exist" containerID="d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.840430 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8"} err="failed to get container status \"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8\": rpc error: code = NotFound desc = could not find container \"d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8\": container with ID starting with d8bdca6642061fa97a7c9bf96b50dc2b220525c54c7b6bcf09e6e2af8d7f31d8 not found: ID does not exist"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.840509 4676 scope.go:117] "RemoveContainer" containerID="2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"
Dec 04 16:42:58 crc kubenswrapper[4676]: E1204 16:42:58.841291 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c\": container with ID starting with 2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c not found: ID does not exist" containerID="2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.841344 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c"} err="failed to get container status \"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c\": rpc error: code = NotFound desc = could not find container \"2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c\": container with ID starting with 2f9c7b3a8d5e38023298fbc9a05788d83c4180d650caeaa91442bee174ceaf5c not found: ID does not exist"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.841376 4676 scope.go:117] "RemoveContainer" containerID="bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1"
Dec 04 16:42:58 crc kubenswrapper[4676]: E1204 16:42:58.841724 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1\": container with ID starting with bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1 not found: ID does not exist" containerID="bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1"
Dec 04 16:42:58 crc kubenswrapper[4676]: I1204 16:42:58.841761 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1"} err="failed to get container status \"bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1\": rpc error: code = NotFound desc = could not find container \"bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1\": container with ID starting with bd9bb42eac3032354c2ca6e8da2b4e09dab5acc0b5c883222ba428bf817773a1 not found: ID does not exist"
Dec 04 16:42:59 crc kubenswrapper[4676]: I1204 16:42:59.397233 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" path="/var/lib/kubelet/pods/5146cd1f-b9bd-4a09-bd37-d683174bc548/volumes"
Dec 04 16:43:10 crc kubenswrapper[4676]: I1204 16:43:10.384846 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:43:10 crc kubenswrapper[4676]: E1204 16:43:10.385954 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:43:22 crc kubenswrapper[4676]: I1204 16:43:22.385581 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:43:22 crc kubenswrapper[4676]: E1204 16:43:22.386504 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:43:35 crc kubenswrapper[4676]: I1204 16:43:35.384833 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:43:35 crc kubenswrapper[4676]: E1204 16:43:35.385818 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:43:48 crc kubenswrapper[4676]: I1204 16:43:48.384635 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:43:48 crc kubenswrapper[4676]: E1204 16:43:48.385575 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:44:02 crc kubenswrapper[4676]: I1204 16:44:02.384389 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:44:02 crc kubenswrapper[4676]: E1204 16:44:02.386385 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:44:16 crc kubenswrapper[4676]: I1204 16:44:16.384559 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:44:16 crc kubenswrapper[4676]: E1204 16:44:16.385440 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:44:28 crc kubenswrapper[4676]: I1204 16:44:28.385192 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:44:28 crc kubenswrapper[4676]: E1204 16:44:28.385960 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:44:42 crc kubenswrapper[4676]: I1204 16:44:42.384895 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:44:42 crc kubenswrapper[4676]: E1204 16:44:42.385642 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:44:56 crc kubenswrapper[4676]: I1204 16:44:56.384971 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:44:56 crc kubenswrapper[4676]: E1204 16:44:56.385871 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.152272 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"]
Dec 04 16:45:00 crc kubenswrapper[4676]: E1204 16:45:00.154051 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="extract-content"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.154094 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="extract-content"
Dec 04 16:45:00 crc kubenswrapper[4676]: E1204 16:45:00.154159 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="extract-utilities"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.154174 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="extract-utilities"
Dec 04 16:45:00 crc kubenswrapper[4676]: E1204 16:45:00.154211 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="registry-server"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.154227 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="registry-server"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.154811 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5146cd1f-b9bd-4a09-bd37-d683174bc548" containerName="registry-server"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.156609 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.162893 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.162956 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.166193 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"]
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.342138 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.342279 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkd9s\" (UniqueName: \"kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.342566 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.445377 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkd9s\" (UniqueName: \"kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.445737 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.445834 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.448801 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.457879 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.472879 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkd9s\" (UniqueName: \"kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s\") pod \"collect-profiles-29414445-zr654\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.493745 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:00 crc kubenswrapper[4676]: I1204 16:45:00.953237 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"]
Dec 04 16:45:01 crc kubenswrapper[4676]: I1204 16:45:01.009180 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654" event={"ID":"c85ee235-cc6e-403a-86db-abacfd662bae","Type":"ContainerStarted","Data":"5f1872fe1123af74e67b493662015ca375975ae48bfd9f0927138fefe47fb9a8"}
Dec 04 16:45:02 crc kubenswrapper[4676]: I1204 16:45:02.020553 4676 generic.go:334] "Generic (PLEG): container finished" podID="c85ee235-cc6e-403a-86db-abacfd662bae" containerID="a0bdf160ccd59218477fbed08a01e26e8b2b57a9db547120030bdbf5df38966d" exitCode=0
Dec 04 16:45:02 crc kubenswrapper[4676]: I1204 16:45:02.020882 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654" event={"ID":"c85ee235-cc6e-403a-86db-abacfd662bae","Type":"ContainerDied","Data":"a0bdf160ccd59218477fbed08a01e26e8b2b57a9db547120030bdbf5df38966d"}
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.616943 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.728275 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkd9s\" (UniqueName: \"kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s\") pod \"c85ee235-cc6e-403a-86db-abacfd662bae\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") "
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.728533 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume\") pod \"c85ee235-cc6e-403a-86db-abacfd662bae\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") "
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.728701 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume\") pod \"c85ee235-cc6e-403a-86db-abacfd662bae\" (UID: \"c85ee235-cc6e-403a-86db-abacfd662bae\") "
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.729189 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume" (OuterVolumeSpecName: "config-volume") pod "c85ee235-cc6e-403a-86db-abacfd662bae" (UID: "c85ee235-cc6e-403a-86db-abacfd662bae"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.734063 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c85ee235-cc6e-403a-86db-abacfd662bae" (UID: "c85ee235-cc6e-403a-86db-abacfd662bae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.735894 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s" (OuterVolumeSpecName: "kube-api-access-wkd9s") pod "c85ee235-cc6e-403a-86db-abacfd662bae" (UID: "c85ee235-cc6e-403a-86db-abacfd662bae"). InnerVolumeSpecName "kube-api-access-wkd9s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.831332 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c85ee235-cc6e-403a-86db-abacfd662bae-config-volume\") on node \"crc\" DevicePath \"\""
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.831376 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c85ee235-cc6e-403a-86db-abacfd662bae-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 04 16:45:03 crc kubenswrapper[4676]: I1204 16:45:03.831394 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkd9s\" (UniqueName: \"kubernetes.io/projected/c85ee235-cc6e-403a-86db-abacfd662bae-kube-api-access-wkd9s\") on node \"crc\" DevicePath \"\""
Dec 04 16:45:04 crc kubenswrapper[4676]: I1204 16:45:04.039353 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654" event={"ID":"c85ee235-cc6e-403a-86db-abacfd662bae","Type":"ContainerDied","Data":"5f1872fe1123af74e67b493662015ca375975ae48bfd9f0927138fefe47fb9a8"}
Dec 04 16:45:04 crc kubenswrapper[4676]: I1204 16:45:04.039400 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f1872fe1123af74e67b493662015ca375975ae48bfd9f0927138fefe47fb9a8"
Dec 04 16:45:04 crc kubenswrapper[4676]: I1204 16:45:04.039451 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414445-zr654"
Dec 04 16:45:04 crc kubenswrapper[4676]: I1204 16:45:04.706162 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h"]
Dec 04 16:45:04 crc kubenswrapper[4676]: I1204 16:45:04.715555 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414400-5pk7h"]
Dec 04 16:45:05 crc kubenswrapper[4676]: I1204 16:45:05.394893 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af" path="/var/lib/kubelet/pods/03fe0d9d-b6b4-4751-a7d6-6fe9c4b6e9af/volumes"
Dec 04 16:45:09 crc kubenswrapper[4676]: I1204 16:45:09.385930 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:45:09 crc kubenswrapper[4676]: E1204 16:45:09.386841 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:45:24 crc kubenswrapper[4676]: I1204 16:45:24.384818 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:45:24 crc kubenswrapper[4676]: E1204 16:45:24.385727 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:45:38 crc kubenswrapper[4676]: I1204 16:45:38.385061 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:45:38 crc kubenswrapper[4676]: E1204 16:45:38.386013 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:45:51 crc kubenswrapper[4676]: I1204 16:45:51.385099 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:45:51 crc kubenswrapper[4676]: E1204 16:45:51.386571 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:46:03 crc kubenswrapper[4676]: I1204 16:46:03.765209 4676 scope.go:117] "RemoveContainer" containerID="9dda80fe98e231dfc7a509531a2d66e41ec11f3b9e9d18959523984a81530edf"
Dec 04 16:46:04 crc kubenswrapper[4676]: I1204 16:46:04.384664 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:46:04 crc kubenswrapper[4676]: E1204 16:46:04.385438 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:46:15 crc kubenswrapper[4676]: I1204 16:46:15.385486 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:46:15 crc kubenswrapper[4676]: E1204 16:46:15.386229 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:46:28 crc kubenswrapper[4676]: I1204 16:46:28.385420 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:46:28 crc kubenswrapper[4676]: E1204 16:46:28.386276 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:46:43 crc kubenswrapper[4676]: I1204 16:46:43.399399 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:46:43 crc kubenswrapper[4676]: E1204 16:46:43.402289 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:46:56 crc kubenswrapper[4676]: I1204 16:46:56.385278 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:46:56 crc kubenswrapper[4676]: E1204 16:46:56.386247 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:47:08 crc kubenswrapper[4676]: I1204 16:47:08.384330 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:47:08 crc kubenswrapper[4676]: E1204 16:47:08.385100 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:47:19 crc kubenswrapper[4676]: I1204 16:47:19.385616 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:47:19 crc kubenswrapper[4676]: E1204 16:47:19.386516 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:47:31 crc kubenswrapper[4676]: I1204 16:47:31.386386 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:47:31 crc kubenswrapper[4676]: E1204 16:47:31.387436 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:47:43 crc kubenswrapper[4676]: I1204 16:47:43.391259 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:47:43 crc kubenswrapper[4676]: E1204 16:47:43.392458 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:47:58 crc kubenswrapper[4676]: I1204 16:47:58.385342 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:47:58 crc kubenswrapper[4676]: I1204 16:47:58.730494 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82"}
Dec 04 16:49:00 crc kubenswrapper[4676]: I1204 16:49:00.883844 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:00 crc kubenswrapper[4676]: E1204 16:49:00.886118 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85ee235-cc6e-403a-86db-abacfd662bae" containerName="collect-profiles"
Dec 04 16:49:00 crc kubenswrapper[4676]: I1204 16:49:00.886224 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85ee235-cc6e-403a-86db-abacfd662bae" containerName="collect-profiles"
Dec 04 16:49:00 crc kubenswrapper[4676]: I1204 16:49:00.886518 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85ee235-cc6e-403a-86db-abacfd662bae" containerName="collect-profiles"
Dec 04 16:49:00 crc kubenswrapper[4676]: I1204 16:49:00.888279 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:00 crc kubenswrapper[4676]: I1204 16:49:00.894064 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.042154 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.042288 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.042337 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4zgk\" (UniqueName: \"kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.144724 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4zgk\" (UniqueName: \"kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.144887 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.145005 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.145436 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.145514 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.165891 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4zgk\" (UniqueName: \"kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk\") pod \"redhat-operators-m8289\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") " pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.214550 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:01 crc kubenswrapper[4676]: I1204 16:49:01.699155 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:02 crc kubenswrapper[4676]: I1204 16:49:02.386233 4676 generic.go:334] "Generic (PLEG): container finished" podID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerID="c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e" exitCode=0
Dec 04 16:49:02 crc kubenswrapper[4676]: I1204 16:49:02.386332 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerDied","Data":"c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e"}
Dec 04 16:49:02 crc kubenswrapper[4676]: I1204 16:49:02.386673 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerStarted","Data":"80ea8c5c68e30f93fab6bd108d46812798467b72bec48e0a9eb9f4fe472a943a"}
Dec 04 16:49:02 crc kubenswrapper[4676]: I1204 16:49:02.387706 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 16:49:03 crc kubenswrapper[4676]: I1204 16:49:03.408269 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerStarted","Data":"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"}
Dec 04 16:49:06 crc kubenswrapper[4676]: I1204 16:49:06.431141 4676 generic.go:334] "Generic (PLEG): container finished" podID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerID="ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da" exitCode=0
Dec 04 16:49:06 crc kubenswrapper[4676]: I1204 16:49:06.431255 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerDied","Data":"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"}
Dec 04 16:49:07 crc kubenswrapper[4676]: I1204 16:49:07.443798 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerStarted","Data":"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"}
Dec 04 16:49:07 crc kubenswrapper[4676]: I1204 16:49:07.468229 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m8289" podStartSLOduration=2.996346819 podStartE2EDuration="7.46821061s" podCreationTimestamp="2025-12-04 16:49:00 +0000 UTC" firstStartedPulling="2025-12-04 16:49:02.387473325 +0000 UTC m=+5349.822143182" lastFinishedPulling="2025-12-04 16:49:06.859337116 +0000 UTC m=+5354.294006973" observedRunningTime="2025-12-04 16:49:07.46109789 +0000 UTC m=+5354.895767767" watchObservedRunningTime="2025-12-04 16:49:07.46821061 +0000 UTC m=+5354.902880467"
Dec 04 16:49:11 crc kubenswrapper[4676]: I1204 16:49:11.215200 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:11 crc kubenswrapper[4676]: I1204 16:49:11.215984 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:12 crc kubenswrapper[4676]: I1204 16:49:12.264273 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m8289" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="registry-server" probeResult="failure" output=<
Dec 04 16:49:12 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s
Dec 04 16:49:12 crc kubenswrapper[4676]: >
Dec 04 16:49:21 crc kubenswrapper[4676]: I1204 16:49:21.269429 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:21 crc kubenswrapper[4676]: I1204 16:49:21.333522 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:21 crc kubenswrapper[4676]: I1204 16:49:21.514477 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:22 crc kubenswrapper[4676]: I1204 16:49:22.599075 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-m8289" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="registry-server" containerID="cri-o://8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3" gracePeriod=2
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.258078 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.357603 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities\") pod \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") "
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.357819 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content\") pod \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") "
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.358019 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4zgk\" (UniqueName: \"kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk\") pod \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\" (UID: \"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f\") "
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.359221 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities" (OuterVolumeSpecName: "utilities") pod "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" (UID: "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.370070 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk" (OuterVolumeSpecName: "kube-api-access-m4zgk") pod "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" (UID: "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f"). InnerVolumeSpecName "kube-api-access-m4zgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.462600 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4zgk\" (UniqueName: \"kubernetes.io/projected/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-kube-api-access-m4zgk\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.462644 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.506064 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" (UID: "9a7fbf96-5872-478e-a7b4-9a8d8c797e5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.565081 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.615871 4676 generic.go:334] "Generic (PLEG): container finished" podID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerID="8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3" exitCode=0
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.615946 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerDied","Data":"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"}
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.615966 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m8289"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.616008 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m8289" event={"ID":"9a7fbf96-5872-478e-a7b4-9a8d8c797e5f","Type":"ContainerDied","Data":"80ea8c5c68e30f93fab6bd108d46812798467b72bec48e0a9eb9f4fe472a943a"}
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.616047 4676 scope.go:117] "RemoveContainer" containerID="8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.652967 4676 scope.go:117] "RemoveContainer" containerID="ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.665803 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.676755 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-m8289"]
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.680173 4676 scope.go:117] "RemoveContainer" containerID="c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.732810 4676 scope.go:117] "RemoveContainer" containerID="8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"
Dec 04 16:49:23 crc kubenswrapper[4676]: E1204 16:49:23.733351 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3\": container with ID starting with 8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3 not found: ID does not exist" containerID="8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.733387 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3"} err="failed to get container status \"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3\": rpc error: code = NotFound desc = could not find container \"8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3\": container with ID starting with 8678ee72d9cfd131dd7d31492c2388b3e1e2bf027f85fa65302882819f3f0fa3 not found: ID does not exist"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.733410 4676 scope.go:117] "RemoveContainer" containerID="ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"
Dec 04 16:49:23 crc kubenswrapper[4676]: E1204 16:49:23.733716 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da\": container with ID starting with ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da not found: ID does not exist" containerID="ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.733769 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da"} err="failed to get container status \"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da\": rpc error: code = NotFound desc = could not find container \"ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da\": container with ID starting with ef7f0d137ea0cf22e75704ec13a87f0b6354dbf626c07b91ef74989738dde2da not found: ID does not exist"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.733801 4676 scope.go:117] "RemoveContainer" containerID="c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e"
Dec 04 16:49:23 crc kubenswrapper[4676]: E1204 16:49:23.734112 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e\": container with ID starting with c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e not found: ID does not exist" containerID="c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e"
Dec 04 16:49:23 crc kubenswrapper[4676]: I1204 16:49:23.734138 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e"} err="failed to get container status \"c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e\": rpc error: code = NotFound desc = could not find container \"c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e\": container with ID starting with c8276f73e35ff9536412182df797420c06cd73e715aa4b9e49252aee5103416e not found: ID does not exist"
Dec 04 16:49:25 crc kubenswrapper[4676]: I1204 16:49:25.395265 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" path="/var/lib/kubelet/pods/9a7fbf96-5872-478e-a7b4-9a8d8c797e5f/volumes"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.887420 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7k8fb"]
Dec 04 16:49:31 crc kubenswrapper[4676]: E1204 16:49:31.888618 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="extract-utilities"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.888700 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="extract-utilities"
Dec 04 16:49:31 crc kubenswrapper[4676]: E1204 16:49:31.888725 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="registry-server"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.888732 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="registry-server"
Dec 04 16:49:31 crc kubenswrapper[4676]: E1204 16:49:31.888767 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="extract-content"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.888777 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="extract-content"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.889030 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7fbf96-5872-478e-a7b4-9a8d8c797e5f" containerName="registry-server"
Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.890734 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k8fb"
Need to start a new one" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.909417 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7k8fb"] Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.957062 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.957549 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:31 crc kubenswrapper[4676]: I1204 16:49:31.957760 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855jd\" (UniqueName: \"kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.059594 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.059729 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.059791 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855jd\" (UniqueName: \"kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.060326 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.060348 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.081733 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-855jd\" (UniqueName: \"kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd\") pod \"community-operators-7k8fb\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") " pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.216067 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:32 crc kubenswrapper[4676]: W1204 16:49:32.770035 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4b15191_0597_4437_9693_2632039a229e.slice/crio-4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f WatchSource:0}: Error finding container 4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f: Status 404 returned error can't find the container with id 4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f Dec 04 16:49:32 crc kubenswrapper[4676]: I1204 16:49:32.775474 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7k8fb"] Dec 04 16:49:33 crc kubenswrapper[4676]: I1204 16:49:33.736598 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4b15191-0597-4437-9693-2632039a229e" containerID="72baf09e1329af14761b0bd232815bb08bc2890df36472740b9f2da58a541f32" exitCode=0 Dec 04 16:49:33 crc kubenswrapper[4676]: I1204 16:49:33.736654 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerDied","Data":"72baf09e1329af14761b0bd232815bb08bc2890df36472740b9f2da58a541f32"} Dec 04 16:49:33 crc kubenswrapper[4676]: I1204 16:49:33.737364 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerStarted","Data":"4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f"} Dec 04 16:49:35 crc kubenswrapper[4676]: I1204 16:49:35.763176 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerStarted","Data":"08cc70531aae29489f138d2c47935bb5815388a9d2663e525946ec26f2186b9c"} Dec 04 16:49:36 crc kubenswrapper[4676]: I1204 16:49:36.775410 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4b15191-0597-4437-9693-2632039a229e" containerID="08cc70531aae29489f138d2c47935bb5815388a9d2663e525946ec26f2186b9c" exitCode=0 Dec 04 16:49:36 crc kubenswrapper[4676]: I1204 16:49:36.775517 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerDied","Data":"08cc70531aae29489f138d2c47935bb5815388a9d2663e525946ec26f2186b9c"} Dec 04 16:49:37 crc kubenswrapper[4676]: I1204 16:49:37.791404 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerStarted","Data":"f49776e029db8e17111cb994898f805938786aafe1e925cbbeb14899bfe70b08"} Dec 04 16:49:37 crc kubenswrapper[4676]: I1204 16:49:37.820722 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7k8fb" 
podStartSLOduration=3.305654495 podStartE2EDuration="6.820703001s" podCreationTimestamp="2025-12-04 16:49:31 +0000 UTC" firstStartedPulling="2025-12-04 16:49:33.73974187 +0000 UTC m=+5381.174411737" lastFinishedPulling="2025-12-04 16:49:37.254790386 +0000 UTC m=+5384.689460243" observedRunningTime="2025-12-04 16:49:37.815657079 +0000 UTC m=+5385.250326956" watchObservedRunningTime="2025-12-04 16:49:37.820703001 +0000 UTC m=+5385.255372858" Dec 04 16:49:42 crc kubenswrapper[4676]: I1204 16:49:42.217297 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:42 crc kubenswrapper[4676]: I1204 16:49:42.218007 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:42 crc kubenswrapper[4676]: I1204 16:49:42.270364 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:42 crc kubenswrapper[4676]: I1204 16:49:42.904222 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7k8fb" Dec 04 16:49:42 crc kubenswrapper[4676]: I1204 16:49:42.955070 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7k8fb"] Dec 04 16:49:44 crc kubenswrapper[4676]: I1204 16:49:44.867529 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7k8fb" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="registry-server" containerID="cri-o://f49776e029db8e17111cb994898f805938786aafe1e925cbbeb14899bfe70b08" gracePeriod=2 Dec 04 16:49:45 crc kubenswrapper[4676]: I1204 16:49:45.879307 4676 generic.go:334] "Generic (PLEG): container finished" podID="a4b15191-0597-4437-9693-2632039a229e" containerID="f49776e029db8e17111cb994898f805938786aafe1e925cbbeb14899bfe70b08" exitCode=0 Dec 04 16:49:45 crc kubenswrapper[4676]: I1204 16:49:45.879357 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerDied","Data":"f49776e029db8e17111cb994898f805938786aafe1e925cbbeb14899bfe70b08"} Dec 04 16:49:45 crc kubenswrapper[4676]: I1204 16:49:45.879704 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7k8fb" event={"ID":"a4b15191-0597-4437-9693-2632039a229e","Type":"ContainerDied","Data":"4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f"} Dec 04 16:49:45 crc kubenswrapper[4676]: I1204 16:49:45.879721 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8b5b1b9e6e39ebfbcd064812db45c4a15dbd7d9e86d971bd6ab49e4767743f" Dec 04 16:49:45 crc kubenswrapper[4676]: I1204 16:49:45.923593 4676 util.go:48] "No ready sandbox for pod can be found. 
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.058939 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855jd\" (UniqueName: \"kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd\") pod \"a4b15191-0597-4437-9693-2632039a229e\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") "
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.059261 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities\") pod \"a4b15191-0597-4437-9693-2632039a229e\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") "
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.059340 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content\") pod \"a4b15191-0597-4437-9693-2632039a229e\" (UID: \"a4b15191-0597-4437-9693-2632039a229e\") "
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.059988 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities" (OuterVolumeSpecName: "utilities") pod "a4b15191-0597-4437-9693-2632039a229e" (UID: "a4b15191-0597-4437-9693-2632039a229e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.064769 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd" (OuterVolumeSpecName: "kube-api-access-855jd") pod "a4b15191-0597-4437-9693-2632039a229e" (UID: "a4b15191-0597-4437-9693-2632039a229e"). InnerVolumeSpecName "kube-api-access-855jd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.127840 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4b15191-0597-4437-9693-2632039a229e" (UID: "a4b15191-0597-4437-9693-2632039a229e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.161573 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855jd\" (UniqueName: \"kubernetes.io/projected/a4b15191-0597-4437-9693-2632039a229e-kube-api-access-855jd\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.161612 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.161626 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b15191-0597-4437-9693-2632039a229e-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.888369 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7k8fb"
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.934627 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7k8fb"]
Dec 04 16:49:46 crc kubenswrapper[4676]: I1204 16:49:46.944443 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7k8fb"]
Dec 04 16:49:47 crc kubenswrapper[4676]: I1204 16:49:47.396382 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b15191-0597-4437-9693-2632039a229e" path="/var/lib/kubelet/pods/a4b15191-0597-4437-9693-2632039a229e/volumes"
Dec 04 16:50:16 crc kubenswrapper[4676]: I1204 16:50:16.027446 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:50:16 crc kubenswrapper[4676]: I1204 16:50:16.028106 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:50:46 crc kubenswrapper[4676]: I1204 16:50:46.026930 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:50:46 crc kubenswrapper[4676]: I1204 16:50:46.027467 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.026773 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.027466 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.027516 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.028384 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.028435 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82" gracePeriod=600
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.991844 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82" exitCode=0
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.991924 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82"}
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.992294 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"}
Dec 04 16:51:16 crc kubenswrapper[4676]: I1204 16:51:16.992327 4676 scope.go:117] "RemoveContainer" containerID="d6e6a8160dc29f480447749bfbf113eb512f47bd97fa4f18d8d0e2a5585bf1fb"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.041435 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-csmsh"]
Dec 04 16:53:08 crc kubenswrapper[4676]: E1204 16:53:08.042694 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="extract-content"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.042719 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="extract-content"
Dec 04 16:53:08 crc kubenswrapper[4676]: E1204 16:53:08.042758 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="registry-server"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.042765 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="registry-server"
Dec 04 16:53:08 crc kubenswrapper[4676]: E1204 16:53:08.042793 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="extract-utilities"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.042800 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="extract-utilities"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.043091 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b15191-0597-4437-9693-2632039a229e" containerName="registry-server"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.045147 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.067846 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csmsh"]
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.207061 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zf7q\" (UniqueName: \"kubernetes.io/projected/3d5c92e1-8000-4f13-8480-1e099388fbfa-kube-api-access-7zf7q\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.207169 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-catalog-content\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.207189 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-utilities\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.310070 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zf7q\" (UniqueName: \"kubernetes.io/projected/3d5c92e1-8000-4f13-8480-1e099388fbfa-kube-api-access-7zf7q\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.310802 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-catalog-content\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.311337 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-catalog-content\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.311404 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-utilities\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.312107 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3d5c92e1-8000-4f13-8480-1e099388fbfa-utilities\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.339717 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zf7q\" (UniqueName: \"kubernetes.io/projected/3d5c92e1-8000-4f13-8480-1e099388fbfa-kube-api-access-7zf7q\") pod \"certified-operators-csmsh\" (UID: \"3d5c92e1-8000-4f13-8480-1e099388fbfa\") " pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:08 crc kubenswrapper[4676]: I1204 16:53:08.385828 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:09 crc kubenswrapper[4676]: I1204 16:53:09.058284 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csmsh"]
Dec 04 16:53:09 crc kubenswrapper[4676]: I1204 16:53:09.121964 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csmsh" event={"ID":"3d5c92e1-8000-4f13-8480-1e099388fbfa","Type":"ContainerStarted","Data":"1a080feab06ded4d42bde30440ab16cdb5a68fbc662caf69dfa113b192f80a78"}
Dec 04 16:53:10 crc kubenswrapper[4676]: I1204 16:53:10.137408 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csmsh" event={"ID":"3d5c92e1-8000-4f13-8480-1e099388fbfa","Type":"ContainerStarted","Data":"05555a8fa182e2dde7a009c8fdab709d4c7d774074c7f5e6fa28997f81452f2b"}
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.148590 4676 generic.go:334] "Generic (PLEG): container finished" podID="3d5c92e1-8000-4f13-8480-1e099388fbfa" containerID="05555a8fa182e2dde7a009c8fdab709d4c7d774074c7f5e6fa28997f81452f2b" exitCode=0
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.148639 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csmsh" event={"ID":"3d5c92e1-8000-4f13-8480-1e099388fbfa","Type":"ContainerDied","Data":"05555a8fa182e2dde7a009c8fdab709d4c7d774074c7f5e6fa28997f81452f2b"}
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.236205 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"]
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.238443 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.261695 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"]
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.278469 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9gj8\" (UniqueName: \"kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.278582 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.278784 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.380340 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.380786 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.380820 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.380893 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9gj8\" (UniqueName: \"kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.381126 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.709747 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9gj8\" (UniqueName: \"kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8\") pod \"redhat-marketplace-qjc9p\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:11 crc kubenswrapper[4676]: I1204 16:53:11.867344 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjc9p"
Dec 04 16:53:12 crc kubenswrapper[4676]: I1204 16:53:12.389868 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"]
Dec 04 16:53:13 crc kubenswrapper[4676]: I1204 16:53:13.173642 4676 generic.go:334] "Generic (PLEG): container finished" podID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerID="c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af" exitCode=0
Dec 04 16:53:13 crc kubenswrapper[4676]: I1204 16:53:13.173720 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerDied","Data":"c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af"}
Dec 04 16:53:13 crc kubenswrapper[4676]: I1204 16:53:13.174039 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerStarted","Data":"5c2fdb789410079a13cbac63f1d2355fd7396a34434301614768158da2e1e5ea"}
Dec 04 16:53:16 crc kubenswrapper[4676]: I1204 16:53:16.027057 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:53:16 crc kubenswrapper[4676]: I1204 16:53:16.027531 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:53:21 crc kubenswrapper[4676]: I1204 16:53:21.248964 4676 generic.go:334] "Generic (PLEG): container finished" podID="3d5c92e1-8000-4f13-8480-1e099388fbfa" containerID="5a9a8a03207154e4c7ab43d125704cca755f7b20c8cf617196d563c17f5cd271" exitCode=0
Dec 04 16:53:21 crc kubenswrapper[4676]: I1204 16:53:21.249117 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csmsh" event={"ID":"3d5c92e1-8000-4f13-8480-1e099388fbfa","Type":"ContainerDied","Data":"5a9a8a03207154e4c7ab43d125704cca755f7b20c8cf617196d563c17f5cd271"}
Dec 04 16:53:21 crc kubenswrapper[4676]: I1204 16:53:21.252075 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerStarted","Data":"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000"}
Dec 04 16:53:22 crc kubenswrapper[4676]: I1204 16:53:22.261994 4676 generic.go:334] "Generic (PLEG): container finished" podID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerID="d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000" exitCode=0
Dec 04 16:53:22 crc kubenswrapper[4676]: I1204 16:53:22.262594 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerDied","Data":"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000"}
Dec 04 16:53:23 crc kubenswrapper[4676]: I1204 16:53:23.275919 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-csmsh" event={"ID":"3d5c92e1-8000-4f13-8480-1e099388fbfa","Type":"ContainerStarted","Data":"f8bdf3d792605121d9b478ad2c7263884d38d9bce471c9cff6e38f323ad29c08"}
Dec 04 16:53:23 crc kubenswrapper[4676]: I1204 16:53:23.279197 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerStarted","Data":"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60"}
Dec 04 16:53:23 crc kubenswrapper[4676]: I1204 16:53:23.302796 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-csmsh" podStartSLOduration=4.460404891 podStartE2EDuration="15.302762933s" podCreationTimestamp="2025-12-04 16:53:08 +0000 UTC" firstStartedPulling="2025-12-04 16:53:11.150938941 +0000 UTC m=+5598.585608808" lastFinishedPulling="2025-12-04 16:53:21.993296993 +0000 UTC m=+5609.427966850" observedRunningTime="2025-12-04 16:53:23.295110787 +0000 UTC m=+5610.729780644" watchObservedRunningTime="2025-12-04 16:53:23.302762933 +0000 UTC m=+5610.737432790"
Dec 04 16:53:23 crc kubenswrapper[4676]: I1204 16:53:23.325249 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qjc9p" podStartSLOduration=2.864476742 podStartE2EDuration="12.325219495s" podCreationTimestamp="2025-12-04 16:53:11 +0000 UTC" firstStartedPulling="2025-12-04 16:53:13.176427499 +0000 UTC m=+5600.611097356" lastFinishedPulling="2025-12-04 16:53:22.637170262 +0000 UTC m=+5610.071840109" observedRunningTime="2025-12-04 16:53:23.3239963 +0000 UTC m=+5610.758666157" watchObservedRunningTime="2025-12-04 16:53:23.325219495 +0000 UTC m=+5610.759889342"
Dec 04 16:53:28 crc kubenswrapper[4676]: I1204 16:53:28.386752 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:28 crc kubenswrapper[4676]: I1204 16:53:28.387463 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:28 crc kubenswrapper[4676]: I1204 16:53:28.443364 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.345115 4676 generic.go:334] "Generic (PLEG): container finished" podID="1728d401-fbd4-470d-8084-deaa0ca6c1b5" containerID="44d20814f0884951383435f51f06966d06d34ad548dc9f3c9cc5a8921d0de952" exitCode=1
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.345229 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1728d401-fbd4-470d-8084-deaa0ca6c1b5","Type":"ContainerDied","Data":"44d20814f0884951383435f51f06966d06d34ad548dc9f3c9cc5a8921d0de952"}
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.398372 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-csmsh"
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.467594 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-csmsh"]
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.513538 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"]
Dec 04 16:53:29 crc kubenswrapper[4676]: I1204 16:53:29.513804 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dd8bp" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="registry-server" containerID="cri-o://5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4" gracePeriod=2
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.060077 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd8bp"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.207794 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content\") pod \"28063a31-4486-49db-9562-331dec0a5349\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.207946 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82rlc\" (UniqueName: \"kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc\") pod \"28063a31-4486-49db-9562-331dec0a5349\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.208110 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities\") pod \"28063a31-4486-49db-9562-331dec0a5349\" (UID: \"28063a31-4486-49db-9562-331dec0a5349\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.208614 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities" (OuterVolumeSpecName: "utilities") pod "28063a31-4486-49db-9562-331dec0a5349" (UID: "28063a31-4486-49db-9562-331dec0a5349"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.214810 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc" (OuterVolumeSpecName: "kube-api-access-82rlc") pod "28063a31-4486-49db-9562-331dec0a5349" (UID: "28063a31-4486-49db-9562-331dec0a5349"). InnerVolumeSpecName "kube-api-access-82rlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.256640 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28063a31-4486-49db-9562-331dec0a5349" (UID: "28063a31-4486-49db-9562-331dec0a5349"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.310975 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82rlc\" (UniqueName: \"kubernetes.io/projected/28063a31-4486-49db-9562-331dec0a5349-kube-api-access-82rlc\") on node \"crc\" DevicePath \"\""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.311006 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-utilities\") on node \"crc\" DevicePath \"\""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.311017 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28063a31-4486-49db-9562-331dec0a5349-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.358859 4676 generic.go:334] "Generic (PLEG): container finished" podID="28063a31-4486-49db-9562-331dec0a5349" containerID="5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4" exitCode=0
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.358965 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dd8bp"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.359004 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerDied","Data":"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"}
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.359073 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dd8bp" event={"ID":"28063a31-4486-49db-9562-331dec0a5349","Type":"ContainerDied","Data":"8929fa4841cb1dcc66190698cadb802e4621343fda8906ee4287f44332116c66"}
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.359095 4676 scope.go:117] "RemoveContainer" containerID="5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.418881 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"]
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.427308 4676 scope.go:117] "RemoveContainer" containerID="80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.431151 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dd8bp"]
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.471484 4676 scope.go:117] "RemoveContainer" containerID="d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.542952 4676 scope.go:117] "RemoveContainer" containerID="5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"
Dec 04 16:53:30 crc kubenswrapper[4676]: E1204 16:53:30.546568 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4\": container with ID starting with 5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4 not found: ID does not exist" containerID="5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.546610 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4"} err="failed to get container status \"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4\": rpc error: code = NotFound desc = could not find container \"5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4\": container with ID starting with 5c57031f68d7fac8251c239d018cd7d8c25cd40961eca03974e1de0f43a385e4 not found: ID does not exist"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.546644 4676 scope.go:117] "RemoveContainer" containerID="80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e"
Dec 04 16:53:30 crc kubenswrapper[4676]: E1204 16:53:30.548748 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e\": container with ID starting with 80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e not found: ID does not exist" containerID="80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.548819 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e"} err="failed to get container status \"80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e\": rpc error: code = NotFound desc = could not find container \"80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e\": container with ID starting with 80a9b4031f2c8fb073c5621036c12578e8cdd281b17f1097bec88c8cb5fd160e not found: ID does not exist"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.548860 4676 scope.go:117] "RemoveContainer" containerID="d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2"
Dec 04 16:53:30 crc kubenswrapper[4676]: E1204 16:53:30.549348 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2\": container with ID starting with d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2 not found: ID does not exist" containerID="d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.549377 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2"} err="failed to get container status \"d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2\": rpc error: code = NotFound desc = could not find container \"d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2\": container with ID starting with d67e7f885ea2960bbd5b6699d14df9aecac5160133c5a52678d0869a4fea6af2 not found: ID does not exist"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.637866 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723653 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723724 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723800 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k85c\" (UniqueName: \"kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723840 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723860 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.723890 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.725106 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.725155 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.725178 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key\") pod \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\" (UID: \"1728d401-fbd4-470d-8084-deaa0ca6c1b5\") "
Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.727433 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data" (OuterVolumeSpecName: "config-data") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
"1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.727710 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.736207 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.736954 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.737539 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c" (OuterVolumeSpecName: "kube-api-access-8k85c") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "kube-api-access-8k85c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.792442 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.803642 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.827978 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828039 4676 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828053 4676 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828066 4676 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828084 4676 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828097 4676 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1728d401-fbd4-470d-8084-deaa0ca6c1b5-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.828112 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k85c\" (UniqueName: \"kubernetes.io/projected/1728d401-fbd4-470d-8084-deaa0ca6c1b5-kube-api-access-8k85c\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.834241 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.870765 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1728d401-fbd4-470d-8084-deaa0ca6c1b5" (UID: "1728d401-fbd4-470d-8084-deaa0ca6c1b5"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.880034 4676 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.930577 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.930617 4676 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:30 crc kubenswrapper[4676]: I1204 16:53:30.930628 4676 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1728d401-fbd4-470d-8084-deaa0ca6c1b5-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.387634 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.398925 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28063a31-4486-49db-9562-331dec0a5349" path="/var/lib/kubelet/pods/28063a31-4486-49db-9562-331dec0a5349/volumes" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.400354 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1728d401-fbd4-470d-8084-deaa0ca6c1b5","Type":"ContainerDied","Data":"6a3fec7f331a1db3dcb21161964bcf4dd921596dc4781869fc43f6490acf3d00"} Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.400411 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3fec7f331a1db3dcb21161964bcf4dd921596dc4781869fc43f6490acf3d00" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.867984 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.868050 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:31 crc kubenswrapper[4676]: I1204 16:53:31.915519 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:32 crc kubenswrapper[4676]: I1204 16:53:32.451970 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:33 crc kubenswrapper[4676]: I1204 16:53:33.903775 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"] Dec 04 16:53:34 crc kubenswrapper[4676]: I1204 16:53:34.416022 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qjc9p" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="registry-server" containerID="cri-o://4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60" gracePeriod=2 Dec 04 16:53:34 crc kubenswrapper[4676]: I1204 16:53:34.893531 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.041245 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content\") pod \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.041429 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9gj8\" (UniqueName: \"kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8\") pod \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.041518 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities\") pod \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\" (UID: \"e28e5149-d14b-4abf-ac30-b2b8f71e50ae\") " Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.043010 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities" (OuterVolumeSpecName: "utilities") pod "e28e5149-d14b-4abf-ac30-b2b8f71e50ae" (UID: "e28e5149-d14b-4abf-ac30-b2b8f71e50ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.051235 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8" (OuterVolumeSpecName: "kube-api-access-g9gj8") pod "e28e5149-d14b-4abf-ac30-b2b8f71e50ae" (UID: "e28e5149-d14b-4abf-ac30-b2b8f71e50ae"). InnerVolumeSpecName "kube-api-access-g9gj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.066577 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e28e5149-d14b-4abf-ac30-b2b8f71e50ae" (UID: "e28e5149-d14b-4abf-ac30-b2b8f71e50ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.144947 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.144986 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.145003 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9gj8\" (UniqueName: \"kubernetes.io/projected/e28e5149-d14b-4abf-ac30-b2b8f71e50ae-kube-api-access-g9gj8\") on node \"crc\" DevicePath \"\"" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.432187 4676 generic.go:334] "Generic (PLEG): container finished" podID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerID="4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60" exitCode=0 Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.432246 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerDied","Data":"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60"} Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.432276 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qjc9p" event={"ID":"e28e5149-d14b-4abf-ac30-b2b8f71e50ae","Type":"ContainerDied","Data":"5c2fdb789410079a13cbac63f1d2355fd7396a34434301614768158da2e1e5ea"} Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.432277 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qjc9p" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.432293 4676 scope.go:117] "RemoveContainer" containerID="4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.461213 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"] Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.463483 4676 scope.go:117] "RemoveContainer" containerID="d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.471448 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qjc9p"] Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.488040 4676 scope.go:117] "RemoveContainer" containerID="c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.535751 4676 scope.go:117] "RemoveContainer" containerID="4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60" Dec 04 16:53:35 crc kubenswrapper[4676]: E1204 16:53:35.536520 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60\": container with ID starting with 4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60 not found: ID does not exist" containerID="4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.536568 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60"} err="failed to get container status \"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60\": rpc error: code = NotFound desc = could not find container \"4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60\": container with ID starting with 4adc015b98e83451bd4293ab46ad4948bb81794d2c0af72c0c0d73f907e30e60 not found: ID does not exist" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.536596 4676 scope.go:117] "RemoveContainer" containerID="d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000" Dec 04 16:53:35 crc kubenswrapper[4676]: E1204 16:53:35.536927 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000\": container with ID starting with d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000 not found: ID does not exist" containerID="d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.536970 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000"} err="failed to get container status \"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000\": rpc error: code = NotFound desc = could not find container \"d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000\": container with ID starting with d5152fecf0d2c39ef8a9010ed95831e95404c9981cf0ce27b15b6dab7f36d000 not found: ID does not exist" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.536999 4676 scope.go:117] "RemoveContainer" 
containerID="c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af" Dec 04 16:53:35 crc kubenswrapper[4676]: E1204 16:53:35.537263 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af\": container with ID starting with c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af not found: ID does not exist" containerID="c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af" Dec 04 16:53:35 crc kubenswrapper[4676]: I1204 16:53:35.537300 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af"} err="failed to get container status \"c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af\": rpc error: code = NotFound desc = could not find container \"c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af\": container with ID starting with c3ec9bdf9c4f2b4d43ba8782f9c16533d1e12e3a1562339e483cae8ed55858af not found: ID does not exist" Dec 04 16:53:37 crc kubenswrapper[4676]: I1204 16:53:37.398501 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" path="/var/lib/kubelet/pods/e28e5149-d14b-4abf-ac30-b2b8f71e50ae/volumes" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.793414 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794339 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1728d401-fbd4-470d-8084-deaa0ca6c1b5" containerName="tempest-tests-tempest-tests-runner" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794355 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="1728d401-fbd4-470d-8084-deaa0ca6c1b5" containerName="tempest-tests-tempest-tests-runner" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794380 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="extract-content" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794386 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="extract-content" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794408 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="extract-utilities" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794415 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="extract-utilities" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794440 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="extract-utilities" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794447 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="extract-utilities" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794465 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="extract-content" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794471 4676 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="extract-content" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794486 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794497 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: E1204 16:53:40.794512 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794524 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794746 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="e28e5149-d14b-4abf-ac30-b2b8f71e50ae" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794772 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="1728d401-fbd4-470d-8084-deaa0ca6c1b5" containerName="tempest-tests-tempest-tests-runner" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.794790 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="28063a31-4486-49db-9562-331dec0a5349" containerName="registry-server" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.795748 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.798178 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-4zwdj" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.815107 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.871647 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m2dv\" (UniqueName: \"kubernetes.io/projected/2db24e9d-bcf8-4e11-8823-5709bb13d99d-kube-api-access-6m2dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.871856 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.973612 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m2dv\" (UniqueName: \"kubernetes.io/projected/2db24e9d-bcf8-4e11-8823-5709bb13d99d-kube-api-access-6m2dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.973834 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.974525 4676 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:40 crc kubenswrapper[4676]: I1204 16:53:40.993293 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m2dv\" (UniqueName: \"kubernetes.io/projected/2db24e9d-bcf8-4e11-8823-5709bb13d99d-kube-api-access-6m2dv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:41 crc kubenswrapper[4676]: I1204 16:53:41.006131 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2db24e9d-bcf8-4e11-8823-5709bb13d99d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:41 crc kubenswrapper[4676]: I1204 16:53:41.120421 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 04 16:53:41 crc kubenswrapper[4676]: I1204 16:53:41.666579 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 04 16:53:41 crc kubenswrapper[4676]: W1204 16:53:41.675668 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db24e9d_bcf8_4e11_8823_5709bb13d99d.slice/crio-a62ce9c12be02270f0311f785fc5602b802c814714133c2e36fe7c0ce597780c WatchSource:0}: Error finding container a62ce9c12be02270f0311f785fc5602b802c814714133c2e36fe7c0ce597780c: Status 404 returned error can't find the container with id a62ce9c12be02270f0311f785fc5602b802c814714133c2e36fe7c0ce597780c Dec 04 16:53:42 crc kubenswrapper[4676]: I1204 16:53:42.513521 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2db24e9d-bcf8-4e11-8823-5709bb13d99d","Type":"ContainerStarted","Data":"a62ce9c12be02270f0311f785fc5602b802c814714133c2e36fe7c0ce597780c"} Dec 04 16:53:43 crc kubenswrapper[4676]: I1204 16:53:43.526408 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2db24e9d-bcf8-4e11-8823-5709bb13d99d","Type":"ContainerStarted","Data":"755b803a0be4ef28ff687ba1d034ca73d67f16044e987c444fa7b320104f2292"} Dec 04 16:53:43 crc kubenswrapper[4676]: I1204 16:53:43.547950 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.138810851 podStartE2EDuration="3.547926445s" podCreationTimestamp="2025-12-04 16:53:40 +0000 UTC" firstStartedPulling="2025-12-04 
Dec 04 16:53:43 crc kubenswrapper[4676]: I1204 16:53:43.547950 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.138810851 podStartE2EDuration="3.547926445s" podCreationTimestamp="2025-12-04 16:53:40 +0000 UTC" firstStartedPulling="2025-12-04 16:53:41.678256831 +0000 UTC m=+5629.112926688" lastFinishedPulling="2025-12-04 16:53:43.087372425 +0000 UTC m=+5630.522042282" observedRunningTime="2025-12-04 16:53:43.545955079 +0000 UTC m=+5630.980624936" watchObservedRunningTime="2025-12-04 16:53:43.547926445 +0000 UTC m=+5630.982596302"
Dec 04 16:53:46 crc kubenswrapper[4676]: I1204 16:53:46.026771 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:53:46 crc kubenswrapper[4676]: I1204 16:53:46.027525 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.418788 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99fmm/must-gather-7bh8h"]
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.421225 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.426580 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-99fmm"/"openshift-service-ca.crt"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.426891 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-99fmm"/"default-dockercfg-jsjmh"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.429172 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-99fmm"/"kube-root-ca.crt"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.434497 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q486r\" (UniqueName: \"kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.434581 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.439824 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-99fmm/must-gather-7bh8h"]
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.536397 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q486r\" (UniqueName: \"kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.536455 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.537034 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.554101 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q486r\" (UniqueName: \"kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r\") pod \"must-gather-7bh8h\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:14 crc kubenswrapper[4676]: I1204 16:54:14.745253 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/must-gather-7bh8h"
Dec 04 16:54:15 crc kubenswrapper[4676]: I1204 16:54:15.289374 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-99fmm/must-gather-7bh8h"]
Dec 04 16:54:15 crc kubenswrapper[4676]: I1204 16:54:15.292304 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 04 16:54:15 crc kubenswrapper[4676]: I1204 16:54:15.966865 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/must-gather-7bh8h" event={"ID":"76d769e1-ed6f-4192-bee8-d36d31249051","Type":"ContainerStarted","Data":"01f9b3febc38bb240c07122c79c1fab88e6cde2da1bb445ac800f9b1842b28ad"}
Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.027451 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.027523 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.027575 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9"
Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.028590 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
containerID="cri-o://3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" gracePeriod=600 Dec 04 16:54:16 crc kubenswrapper[4676]: E1204 16:54:16.305137 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.984421 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" exitCode=0 Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.984476 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"} Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.984514 4676 scope.go:117] "RemoveContainer" containerID="2be211197532ccc2ed17c6a9af3cfe8084e22ef83a0ba97237e594e45e560a82" Dec 04 16:54:16 crc kubenswrapper[4676]: I1204 16:54:16.985286 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:54:16 crc kubenswrapper[4676]: E1204 16:54:16.985578 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:54:25 crc kubenswrapper[4676]: I1204 16:54:25.076658 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/must-gather-7bh8h" event={"ID":"76d769e1-ed6f-4192-bee8-d36d31249051","Type":"ContainerStarted","Data":"c9e5428563fb1411cf59b64f42b8df2dfd17924001cfad837208a417370ff854"} Dec 04 16:54:26 crc kubenswrapper[4676]: I1204 16:54:26.088376 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/must-gather-7bh8h" event={"ID":"76d769e1-ed6f-4192-bee8-d36d31249051","Type":"ContainerStarted","Data":"ed34e5bc679f22102152d411699172f301b47fd29bf87582b434113fd0617af7"} Dec 04 16:54:26 crc kubenswrapper[4676]: I1204 16:54:26.116307 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-99fmm/must-gather-7bh8h" podStartSLOduration=3.148091357 podStartE2EDuration="12.116288048s" podCreationTimestamp="2025-12-04 16:54:14 +0000 UTC" firstStartedPulling="2025-12-04 16:54:15.291877092 +0000 UTC m=+5662.726546949" lastFinishedPulling="2025-12-04 16:54:24.260073783 +0000 UTC m=+5671.694743640" observedRunningTime="2025-12-04 16:54:26.104792905 +0000 UTC m=+5673.539462762" watchObservedRunningTime="2025-12-04 16:54:26.116288048 +0000 UTC m=+5673.550957905" Dec 04 16:54:29 crc kubenswrapper[4676]: I1204 16:54:29.384973 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:54:29 crc kubenswrapper[4676]: E1204 
16:54:29.385888 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:54:30 crc kubenswrapper[4676]: E1204 16:54:30.304193 4676 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.158:53646->38.102.83.158:40877: read tcp 38.102.83.158:53646->38.102.83.158:40877: read: connection reset by peer Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.490014 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99fmm/crc-debug-fw9h6"] Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.491824 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.536474 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgv92\" (UniqueName: \"kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.536831 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.639400 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgv92\" (UniqueName: \"kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.639497 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.639705 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.660690 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgv92\" (UniqueName: \"kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92\") pod \"crc-debug-fw9h6\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:31 crc kubenswrapper[4676]: I1204 16:54:31.815985 4676 util.go:30] "No sandbox for pod can be found. 
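crc-debug-fw9h6 is the kind of pod `oc debug node/…` creates: a host-path mount of the node root at /host plus a projected service-account token, scheduled directly on the target node. The upgradeaware proxy error just above looks like a dropped exec/port-forward stream and is unrelated to the debug pod itself. A sketch of the equivalent manual invocation (node name from this cluster):

    # Roughly what the must-gather tooling does here: a node debug pod
    # with the host filesystem mounted at /host.
    oc debug node/crc -- chroot /host uname -a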
Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:54:32 crc kubenswrapper[4676]: I1204 16:54:32.147131 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" event={"ID":"5e2507fc-140b-4056-89a1-ea83aad1620b","Type":"ContainerStarted","Data":"36169f6d3476d93e619ae043b5da935269e4b6a38c623a9dedf71249574b076f"} Dec 04 16:54:40 crc kubenswrapper[4676]: I1204 16:54:40.385390 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:54:40 crc kubenswrapper[4676]: E1204 16:54:40.386534 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:54:51 crc kubenswrapper[4676]: I1204 16:54:51.384035 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:54:51 crc kubenswrapper[4676]: E1204 16:54:51.385310 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:54:57 crc kubenswrapper[4676]: E1204 16:54:57.880117 4676 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 04 16:54:57 crc kubenswrapper[4676]: E1204 16:54:57.880805 4676 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zgv92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-fw9h6_openshift-must-gather-99fmm(5e2507fc-140b-4056-89a1-ea83aad1620b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 04 16:54:57 crc kubenswrapper[4676]: E1204 16:54:57.882037 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" Dec 04 16:54:58 crc kubenswrapper[4676]: E1204 16:54:58.443717 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" Dec 04 16:55:05 crc kubenswrapper[4676]: I1204 16:55:05.384553 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:55:05 crc kubenswrapper[4676]: E1204 16:55:05.385440 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:55:16 crc kubenswrapper[4676]: I1204 16:55:16.626324 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" event={"ID":"5e2507fc-140b-4056-89a1-ea83aad1620b","Type":"ContainerStarted","Data":"58e48defa5d5b71438c3f2382d2e1180f6ded91aca0510db43478fd5097ae70a"} Dec 04 16:55:16 crc kubenswrapper[4676]: I1204 16:55:16.643734 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" podStartSLOduration=1.762223109 podStartE2EDuration="45.643713054s" podCreationTimestamp="2025-12-04 16:54:31 +0000 UTC" firstStartedPulling="2025-12-04 16:54:31.864096135 +0000 UTC 
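The "Unhandled Error" dump above is just the full container spec the kubelet failed to start; buried in it is the actual payload, a bash script that chroots into /host, pulls registry.redhat.io/rhel9/support-tools with podman, runs a plugin-scoped `sos report` into /var/tmp/sos-osp, and tars up rotated pod logs. The first pull of the quay.io debug image was canceled mid-copy ("context canceled"), the pod cycled through ErrImagePull and ImagePullBackOff, and the retried pull succeeded about 45 seconds after the pod was created, as the ContainerStarted event shows. Retrying the same pull by hand (digest taken from this log):

    # Retry the pull CRI-O reported as canceled, pinned to the same digest.
    sudo crictl pull \
      quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296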
m=+5679.298765992" lastFinishedPulling="2025-12-04 16:55:15.74558608 +0000 UTC m=+5723.180255937" observedRunningTime="2025-12-04 16:55:16.640339749 +0000 UTC m=+5724.075009616" watchObservedRunningTime="2025-12-04 16:55:16.643713054 +0000 UTC m=+5724.078382911" Dec 04 16:55:18 crc kubenswrapper[4676]: I1204 16:55:18.383886 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:55:18 crc kubenswrapper[4676]: E1204 16:55:18.384905 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:55:29 crc kubenswrapper[4676]: I1204 16:55:29.385055 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:55:29 crc kubenswrapper[4676]: E1204 16:55:29.386129 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:55:44 crc kubenswrapper[4676]: I1204 16:55:44.385202 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:55:44 crc kubenswrapper[4676]: E1204 16:55:44.385976 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:55:55 crc kubenswrapper[4676]: I1204 16:55:55.384709 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:55:55 crc kubenswrapper[4676]: E1204 16:55:55.385618 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:56:04 crc kubenswrapper[4676]: I1204 16:56:04.139791 4676 scope.go:117] "RemoveContainer" containerID="72baf09e1329af14761b0bd232815bb08bc2890df36472740b9f2da58a541f32" Dec 04 16:56:04 crc kubenswrapper[4676]: I1204 16:56:04.172683 4676 scope.go:117] "RemoveContainer" containerID="f49776e029db8e17111cb994898f805938786aafe1e925cbbeb14899bfe70b08" Dec 04 16:56:04 crc kubenswrapper[4676]: I1204 16:56:04.220680 4676 scope.go:117] "RemoveContainer" containerID="08cc70531aae29489f138d2c47935bb5815388a9d2663e525946ec26f2186b9c" Dec 04 16:56:09 crc kubenswrapper[4676]: I1204 16:56:09.385661 
4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:56:09 crc kubenswrapper[4676]: E1204 16:56:09.386492 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:56:23 crc kubenswrapper[4676]: I1204 16:56:23.395034 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:56:23 crc kubenswrapper[4676]: E1204 16:56:23.396657 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:56:35 crc kubenswrapper[4676]: I1204 16:56:35.387050 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:56:35 crc kubenswrapper[4676]: E1204 16:56:35.387721 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:56:36 crc kubenswrapper[4676]: I1204 16:56:36.424295 4676 generic.go:334] "Generic (PLEG): container finished" podID="5e2507fc-140b-4056-89a1-ea83aad1620b" containerID="58e48defa5d5b71438c3f2382d2e1180f6ded91aca0510db43478fd5097ae70a" exitCode=0 Dec 04 16:56:36 crc kubenswrapper[4676]: I1204 16:56:36.424373 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" event={"ID":"5e2507fc-140b-4056-89a1-ea83aad1620b","Type":"ContainerDied","Data":"58e48defa5d5b71438c3f2382d2e1180f6ded91aca0510db43478fd5097ae70a"} Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.550612 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-fw9h6" Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.584185 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-fw9h6"] Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.592690 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-fw9h6"] Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.681492 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgv92\" (UniqueName: \"kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92\") pod \"5e2507fc-140b-4056-89a1-ea83aad1620b\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.681979 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host\") pod \"5e2507fc-140b-4056-89a1-ea83aad1620b\" (UID: \"5e2507fc-140b-4056-89a1-ea83aad1620b\") " Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.682058 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host" (OuterVolumeSpecName: "host") pod "5e2507fc-140b-4056-89a1-ea83aad1620b" (UID: "5e2507fc-140b-4056-89a1-ea83aad1620b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.683046 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2507fc-140b-4056-89a1-ea83aad1620b-host\") on node \"crc\" DevicePath \"\"" Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.688838 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92" (OuterVolumeSpecName: "kube-api-access-zgv92") pod "5e2507fc-140b-4056-89a1-ea83aad1620b" (UID: "5e2507fc-140b-4056-89a1-ea83aad1620b"). InnerVolumeSpecName "kube-api-access-zgv92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 16:56:37 crc kubenswrapper[4676]: I1204 16:56:37.785668 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgv92\" (UniqueName: \"kubernetes.io/projected/5e2507fc-140b-4056-89a1-ea83aad1620b-kube-api-access-zgv92\") on node \"crc\" DevicePath \"\"" Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.451746 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36169f6d3476d93e619ae043b5da935269e4b6a38c623a9dedf71249574b076f" Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.452081 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-fw9h6"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.780013 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99fmm/crc-debug-v92kv"]
Dec 04 16:56:38 crc kubenswrapper[4676]: E1204 16:56:38.780566 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" containerName="container-00"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.780583 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" containerName="container-00"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.780919 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" containerName="container-00"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.781835 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.907944 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npp5h\" (UniqueName: \"kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:38 crc kubenswrapper[4676]: I1204 16:56:38.908058 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.010516 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npp5h\" (UniqueName: \"kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.010662 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.010843 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.039246 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npp5h\" (UniqueName: \"kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h\") pod \"crc-debug-v92kv\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") " pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.103623 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:39 crc kubenswrapper[4676]: W1204 16:56:39.158678 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfdb10d5_951a_4769_816f_aac66f58a462.slice/crio-0e65be87e88fa0405b571a5384ef1634424f172da8217e2c8113b0ca6ce77a5a WatchSource:0}: Error finding container 0e65be87e88fa0405b571a5384ef1634424f172da8217e2c8113b0ca6ce77a5a: Status 404 returned error can't find the container with id 0e65be87e88fa0405b571a5384ef1634424f172da8217e2c8113b0ca6ce77a5a
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.415087 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2507fc-140b-4056-89a1-ea83aad1620b" path="/var/lib/kubelet/pods/5e2507fc-140b-4056-89a1-ea83aad1620b/volumes"
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.460556 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-v92kv" event={"ID":"bfdb10d5-951a-4769-816f-aac66f58a462","Type":"ContainerStarted","Data":"78a96f6e959007c020442e2b51fa3f90ff10af3a64b87eab75578b2e5c0209ff"}
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.460604 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-v92kv" event={"ID":"bfdb10d5-951a-4769-816f-aac66f58a462","Type":"ContainerStarted","Data":"0e65be87e88fa0405b571a5384ef1634424f172da8217e2c8113b0ca6ce77a5a"}
Dec 04 16:56:39 crc kubenswrapper[4676]: I1204 16:56:39.486697 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-99fmm/crc-debug-v92kv" podStartSLOduration=1.486673342 podStartE2EDuration="1.486673342s" podCreationTimestamp="2025-12-04 16:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 16:56:39.473612075 +0000 UTC m=+5806.908281942" watchObservedRunningTime="2025-12-04 16:56:39.486673342 +0000 UTC m=+5806.921343189"
Dec 04 16:56:40 crc kubenswrapper[4676]: I1204 16:56:40.471141 4676 generic.go:334] "Generic (PLEG): container finished" podID="bfdb10d5-951a-4769-816f-aac66f58a462" containerID="78a96f6e959007c020442e2b51fa3f90ff10af3a64b87eab75578b2e5c0209ff" exitCode=0
Dec 04 16:56:40 crc kubenswrapper[4676]: I1204 16:56:40.471339 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-v92kv" event={"ID":"bfdb10d5-951a-4769-816f-aac66f58a462","Type":"ContainerDied","Data":"78a96f6e959007c020442e2b51fa3f90ff10af3a64b87eab75578b2e5c0209ff"}
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.582386 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.656766 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npp5h\" (UniqueName: \"kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h\") pod \"bfdb10d5-951a-4769-816f-aac66f58a462\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") "
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.656845 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host\") pod \"bfdb10d5-951a-4769-816f-aac66f58a462\" (UID: \"bfdb10d5-951a-4769-816f-aac66f58a462\") "
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.657580 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host" (OuterVolumeSpecName: "host") pod "bfdb10d5-951a-4769-816f-aac66f58a462" (UID: "bfdb10d5-951a-4769-816f-aac66f58a462"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.671925 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h" (OuterVolumeSpecName: "kube-api-access-npp5h") pod "bfdb10d5-951a-4769-816f-aac66f58a462" (UID: "bfdb10d5-951a-4769-816f-aac66f58a462"). InnerVolumeSpecName "kube-api-access-npp5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.760314 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npp5h\" (UniqueName: \"kubernetes.io/projected/bfdb10d5-951a-4769-816f-aac66f58a462-kube-api-access-npp5h\") on node \"crc\" DevicePath \"\""
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.760464 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bfdb10d5-951a-4769-816f-aac66f58a462-host\") on node \"crc\" DevicePath \"\""
Dec 04 16:56:41 crc kubenswrapper[4676]: I1204 16:56:41.995300 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-v92kv"]
Dec 04 16:56:42 crc kubenswrapper[4676]: I1204 16:56:42.012393 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-v92kv"]
Dec 04 16:56:42 crc kubenswrapper[4676]: I1204 16:56:42.496181 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e65be87e88fa0405b571a5384ef1634424f172da8217e2c8113b0ca6ce77a5a"
Dec 04 16:56:42 crc kubenswrapper[4676]: I1204 16:56:42.496563 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-v92kv"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.211096 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-99fmm/crc-debug-qpftd"]
Dec 04 16:56:43 crc kubenswrapper[4676]: E1204 16:56:43.211612 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdb10d5-951a-4769-816f-aac66f58a462" containerName="container-00"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.211629 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdb10d5-951a-4769-816f-aac66f58a462" containerName="container-00"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.211855 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdb10d5-951a-4769-816f-aac66f58a462" containerName="container-00"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.212637 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.293178 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92cxl\" (UniqueName: \"kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.293529 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.395169 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.395358 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.395680 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92cxl\" (UniqueName: \"kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.401801 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfdb10d5-951a-4769-816f-aac66f58a462" path="/var/lib/kubelet/pods/bfdb10d5-951a-4769-816f-aac66f58a462/volumes"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.420365 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92cxl\" (UniqueName: \"kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl\") pod \"crc-debug-qpftd\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") " pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: I1204 16:56:43.529822 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:43 crc kubenswrapper[4676]: W1204 16:56:43.573730 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3f02b46_8df8_40a5_89b5_72b94a15c519.slice/crio-d58d49039ddba0c4c0b79bdd255fc8f78a125b64a061764a2f0954adcb91ea79 WatchSource:0}: Error finding container d58d49039ddba0c4c0b79bdd255fc8f78a125b64a061764a2f0954adcb91ea79: Status 404 returned error can't find the container with id d58d49039ddba0c4c0b79bdd255fc8f78a125b64a061764a2f0954adcb91ea79
Dec 04 16:56:44 crc kubenswrapper[4676]: I1204 16:56:44.519221 4676 generic.go:334] "Generic (PLEG): container finished" podID="a3f02b46-8df8-40a5-89b5-72b94a15c519" containerID="c1458b8fdc86fadd400a830e6a75867b9ad0d646ae192f18062184dad8e277cb" exitCode=0
Dec 04 16:56:44 crc kubenswrapper[4676]: I1204 16:56:44.519324 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-qpftd" event={"ID":"a3f02b46-8df8-40a5-89b5-72b94a15c519","Type":"ContainerDied","Data":"c1458b8fdc86fadd400a830e6a75867b9ad0d646ae192f18062184dad8e277cb"}
Dec 04 16:56:44 crc kubenswrapper[4676]: I1204 16:56:44.519646 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/crc-debug-qpftd" event={"ID":"a3f02b46-8df8-40a5-89b5-72b94a15c519","Type":"ContainerStarted","Data":"d58d49039ddba0c4c0b79bdd255fc8f78a125b64a061764a2f0954adcb91ea79"}
Dec 04 16:56:44 crc kubenswrapper[4676]: I1204 16:56:44.568565 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-qpftd"]
Dec 04 16:56:44 crc kubenswrapper[4676]: I1204 16:56:44.580965 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99fmm/crc-debug-qpftd"]
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.642849 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.745779 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92cxl\" (UniqueName: \"kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl\") pod \"a3f02b46-8df8-40a5-89b5-72b94a15c519\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") "
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.746055 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host\") pod \"a3f02b46-8df8-40a5-89b5-72b94a15c519\" (UID: \"a3f02b46-8df8-40a5-89b5-72b94a15c519\") "
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.746153 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host" (OuterVolumeSpecName: "host") pod "a3f02b46-8df8-40a5-89b5-72b94a15c519" (UID: "a3f02b46-8df8-40a5-89b5-72b94a15c519"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.746669 4676 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3f02b46-8df8-40a5-89b5-72b94a15c519-host\") on node \"crc\" DevicePath \"\""
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.760270 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl" (OuterVolumeSpecName: "kube-api-access-92cxl") pod "a3f02b46-8df8-40a5-89b5-72b94a15c519" (UID: "a3f02b46-8df8-40a5-89b5-72b94a15c519"). InnerVolumeSpecName "kube-api-access-92cxl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 04 16:56:45 crc kubenswrapper[4676]: I1204 16:56:45.848517 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92cxl\" (UniqueName: \"kubernetes.io/projected/a3f02b46-8df8-40a5-89b5-72b94a15c519-kube-api-access-92cxl\") on node \"crc\" DevicePath \"\""
Dec 04 16:56:46 crc kubenswrapper[4676]: I1204 16:56:46.540176 4676 scope.go:117] "RemoveContainer" containerID="c1458b8fdc86fadd400a830e6a75867b9ad0d646ae192f18062184dad8e277cb"
Dec 04 16:56:46 crc kubenswrapper[4676]: I1204 16:56:46.540228 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/crc-debug-qpftd"
Dec 04 16:56:47 crc kubenswrapper[4676]: I1204 16:56:47.396813 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3f02b46-8df8-40a5-89b5-72b94a15c519" path="/var/lib/kubelet/pods/a3f02b46-8df8-40a5-89b5-72b94a15c519/volumes"
Dec 04 16:56:50 crc kubenswrapper[4676]: I1204 16:56:50.385264 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:56:50 crc kubenswrapper[4676]: E1204 16:56:50.386174 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:02 crc kubenswrapper[4676]: I1204 16:57:02.384794 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:57:02 crc kubenswrapper[4676]: E1204 16:57:02.385860 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:12 crc kubenswrapper[4676]: I1204 16:57:12.584443 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c64fd75cd-msd6p_baa2202e-331f-46d6-b6a8-e7b6484029f6/barbican-api/0.log"
Dec 04 16:57:12 crc kubenswrapper[4676]: I1204 16:57:12.700103 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6c64fd75cd-msd6p_baa2202e-331f-46d6-b6a8-e7b6484029f6/barbican-api-log/0.log"
Dec 04 16:57:12 crc kubenswrapper[4676]: I1204 16:57:12.790193 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8d8c7d4-6r94k_cf2c938b-0504-4743-95aa-40338211a37c/barbican-keystone-listener/0.log"
Dec 04 16:57:12 crc kubenswrapper[4676]: I1204 16:57:12.874704 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7d8d8c7d4-6r94k_cf2c938b-0504-4743-95aa-40338211a37c/barbican-keystone-listener-log/0.log"
Dec 04 16:57:12 crc kubenswrapper[4676]: I1204 16:57:12.999875 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d67df5bf5-pk5hl_8ca5926d-be39-4cda-b11d-bbed877ffa22/barbican-worker/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.042116 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d67df5bf5-pk5hl_8ca5926d-be39-4cda-b11d-bbed877ffa22/barbican-worker-log/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.461004 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zlcgw_7778f969-2f94-4830-8685-bb42b6a9fd23/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.618304 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_920f3ae5-c94b-486c-b387-6774d1e29587/ceilometer-central-agent/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.647256 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_920f3ae5-c94b-486c-b387-6774d1e29587/ceilometer-notification-agent/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.721549 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_920f3ae5-c94b-486c-b387-6774d1e29587/proxy-httpd/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.754339 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_920f3ae5-c94b-486c-b387-6774d1e29587/sg-core/0.log"
Dec 04 16:57:13 crc kubenswrapper[4676]: I1204 16:57:13.983710 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d9ccb2a9-3d12-4899-bae2-618d80e5167c/cinder-api-log/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.337414 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4824604f-7b99-455c-be80-b8410dc47264/probe/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.384454 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:57:14 crc kubenswrapper[4676]: E1204 16:57:14.384801 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.543105 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4824604f-7b99-455c-be80-b8410dc47264/cinder-backup/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.571863 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d9ccb2a9-3d12-4899-bae2-618d80e5167c/cinder-api/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.611015 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68ff764e-4045-42f0-83c6-b0ab7a4f3d7d/cinder-scheduler/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.653791 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_68ff764e-4045-42f0-83c6-b0ab7a4f3d7d/probe/0.log"
Dec 04 16:57:14 crc kubenswrapper[4676]: I1204 16:57:14.935742 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_2edf87ae-1216-4015-9a84-9db0c05f045e/probe/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.111845 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_2edf87ae-1216-4015-9a84-9db0c05f045e/cinder-volume/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.226422 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_9572f37c-801d-4ea4-acfe-4ad3be15946a/probe/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.339559 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-htrzx_1ada8c79-9112-4e01-9e1f-0289338b6191/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.351744 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_9572f37c-801d-4ea4-acfe-4ad3be15946a/cinder-volume/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.539650 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kmlph_fc2720ac-f3d3-4b6e-b00a-dda587b2ad1d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.598558 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b864cb897-lcnmv_7d01e1f6-a481-4501-879f-e099a53f3070/init/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.820964 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b864cb897-lcnmv_7d01e1f6-a481-4501-879f-e099a53f3070/init/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.916399 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-s2cqz_59ed14d8-9b88-49e8-ac61-213b3a6908e7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:15 crc kubenswrapper[4676]: I1204 16:57:15.994454 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6b864cb897-lcnmv_7d01e1f6-a481-4501-879f-e099a53f3070/dnsmasq-dns/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.165400 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5a9c189a-a32b-46fc-99ef-c643d9959aa5/glance-log/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.167698 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_5a9c189a-a32b-46fc-99ef-c643d9959aa5/glance-httpd/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.338035 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_98ccf1a8-b6c5-4f19-af89-531b204e79eb/glance-log/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.370570 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_98ccf1a8-b6c5-4f19-af89-531b204e79eb/glance-httpd/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.554474 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-74857cd458-nnlq7_062c032e-aef9-4036-8d2b-dc89641ed977/horizon/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.827577 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-n9mgw_314126ca-1837-48ba-a5b3-fa2b752ff6e6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:16 crc kubenswrapper[4676]: I1204 16:57:16.959158 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-qtl7t_dda1ce80-bdc4-4c1f-a7f0-dc6a630c2fee/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:17 crc kubenswrapper[4676]: I1204 16:57:17.551400 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29414401-6hznt_8cd36f16-1d73-423c-918e-7e1e85929fb7/keystone-cron/0.log"
Dec 04 16:57:17 crc kubenswrapper[4676]: I1204 16:57:17.552218 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-74857cd458-nnlq7_062c032e-aef9-4036-8d2b-dc89641ed977/horizon-log/0.log"
Dec 04 16:57:17 crc kubenswrapper[4676]: I1204 16:57:17.703942 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_cb6a2b06-d8cb-4925-97c6-90172194a399/kube-state-metrics/0.log"
Dec 04 16:57:17 crc kubenswrapper[4676]: I1204 16:57:17.818197 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-97885899c-28t7l_2d21c3e9-53ed-4671-b832-04c115971b6c/keystone-api/0.log"
Dec 04 16:57:17 crc kubenswrapper[4676]: I1204 16:57:17.862794 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dcbhp_9724a435-38f2-4384-b3fe-d5229301866d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:18 crc kubenswrapper[4676]: I1204 16:57:18.377810 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68bd568fd5-srw6v_5eab48dd-24f7-4439-bcc1-29f34b005bda/neutron-httpd/0.log"
Dec 04 16:57:18 crc kubenswrapper[4676]: I1204 16:57:18.380007 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xnxct_9ecf8093-2284-4bcf-adb4-c2880f87b7e9/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:18 crc kubenswrapper[4676]: I1204 16:57:18.451405 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-68bd568fd5-srw6v_5eab48dd-24f7-4439-bcc1-29f34b005bda/neutron-api/0.log"
Dec 04 16:57:19 crc kubenswrapper[4676]: I1204 16:57:19.066823 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_341ba99e-36fa-4121-978a-de87bfd92b85/nova-cell0-conductor-conductor/0.log"
Dec 04 16:57:19 crc kubenswrapper[4676]: I1204 16:57:19.354701 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_f4c8dccd-8cfb-4b04-a035-e7af36e48038/nova-cell1-conductor-conductor/0.log"
Dec 04 16:57:19 crc kubenswrapper[4676]: I1204 16:57:19.758432 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0081333b-fdf2-4cc5-924c-3d1ad7dc0419/nova-cell1-novncproxy-novncproxy/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.152117 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-px4sr_7841e048-3b6b-4361-a2f5-0d7de2cca7e9/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.261738 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2199ce2b-f085-4ad8-8048-d13b4399ff13/nova-api-log/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.462315 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e0794dc7-c796-4e57-bf9e-eefb1ac8e72c/nova-metadata-log/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.864886 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_2199ce2b-f085-4ad8-8048-d13b4399ff13/nova-api-api/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.986691 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c52ad2e5-0a77-4894-8535-30b4e98cdda9/mysql-bootstrap/0.log"
Dec 04 16:57:20 crc kubenswrapper[4676]: I1204 16:57:20.991381 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bdf5ba9f-064d-481b-be8f-9682f56de62e/nova-scheduler-scheduler/0.log"
Dec 04 16:57:21 crc kubenswrapper[4676]: I1204 16:57:21.219255 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c52ad2e5-0a77-4894-8535-30b4e98cdda9/galera/0.log"
Dec 04 16:57:21 crc kubenswrapper[4676]: I1204 16:57:21.249383 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c52ad2e5-0a77-4894-8535-30b4e98cdda9/mysql-bootstrap/0.log"
Dec 04 16:57:21 crc kubenswrapper[4676]: I1204 16:57:21.420974 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3588a213-92d7-43d7-8a28-6a9104f1d48e/mysql-bootstrap/0.log"
Dec 04 16:57:21 crc kubenswrapper[4676]: I1204 16:57:21.716253 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3588a213-92d7-43d7-8a28-6a9104f1d48e/mysql-bootstrap/0.log"
Dec 04 16:57:21 crc kubenswrapper[4676]: I1204 16:57:21.773058 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3588a213-92d7-43d7-8a28-6a9104f1d48e/galera/0.log"
Dec 04 16:57:22 crc kubenswrapper[4676]: I1204 16:57:22.028669 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_da921c96-bdd0-4aa2-a98e-9adc22788b75/openstackclient/0.log"
Dec 04 16:57:22 crc kubenswrapper[4676]: I1204 16:57:22.096753 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-hdtnf_ce63098e-8737-4061-94ce-2b8c76ccb26f/ovn-controller/0.log"
Dec 04 16:57:22 crc kubenswrapper[4676]: I1204 16:57:22.335491 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rj748_3ca82100-5ba8-449c-a122-fbc3277ba4d7/openstack-network-exporter/0.log"
Dec 04 16:57:22 crc kubenswrapper[4676]: I1204 16:57:22.687259 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8r4vm_4726f661-5133-4c0f-8f63-5a93481ed0df/ovsdb-server-init/0.log"
Dec 04 16:57:22 crc kubenswrapper[4676]: I1204 16:57:22.943536 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8r4vm_4726f661-5133-4c0f-8f63-5a93481ed0df/ovsdb-server/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.008618 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8r4vm_4726f661-5133-4c0f-8f63-5a93481ed0df/ovsdb-server-init/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.205551 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e0794dc7-c796-4e57-bf9e-eefb1ac8e72c/nova-metadata-metadata/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.260867 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-thh6s_a5dbc42d-5e2f-4114-adf7-9bbf7ef7a041/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.346873 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8r4vm_4726f661-5133-4c0f-8f63-5a93481ed0df/ovs-vswitchd/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.427224 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_401b9eed-f3f4-4794-bab2-83bc5fd89deb/openstack-network-exporter/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.509975 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_401b9eed-f3f4-4794-bab2-83bc5fd89deb/ovn-northd/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.628525 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3e3a586c-5d43-4f0a-9f77-038f2a5a0880/openstack-network-exporter/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.700672 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3e3a586c-5d43-4f0a-9f77-038f2a5a0880/ovsdbserver-nb/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.853186 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_163c3f92-f9e6-43bb-8958-c3715f2dae4a/openstack-network-exporter/0.log"
Dec 04 16:57:23 crc kubenswrapper[4676]: I1204 16:57:23.890765 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_163c3f92-f9e6-43bb-8958-c3715f2dae4a/ovsdbserver-sb/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.239080 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75f9dc548b-ctwhb_0bea0dc8-b7f4-4623-95a6-813e42180090/placement-api/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.307426 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f6970c56-0104-45cf-a58e-91be763b6054/init-config-reloader/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.355669 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-75f9dc548b-ctwhb_0bea0dc8-b7f4-4623-95a6-813e42180090/placement-log/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.459155 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f6970c56-0104-45cf-a58e-91be763b6054/config-reloader/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.490012 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f6970c56-0104-45cf-a58e-91be763b6054/init-config-reloader/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.550431 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f6970c56-0104-45cf-a58e-91be763b6054/prometheus/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.624634 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_f6970c56-0104-45cf-a58e-91be763b6054/thanos-sidecar/0.log"
Dec 04 16:57:24 crc kubenswrapper[4676]: I1204 16:57:24.689628 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90b5e80e-65ee-42be-bf95-72e121d8e888/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.014215 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a074e2a9-e6e9-488d-8338-54231ab8faf9/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.031887 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90b5e80e-65ee-42be-bf95-72e121d8e888/rabbitmq/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.040209 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_90b5e80e-65ee-42be-bf95-72e121d8e888/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.267547 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a074e2a9-e6e9-488d-8338-54231ab8faf9/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.286793 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a074e2a9-e6e9-488d-8338-54231ab8faf9/rabbitmq/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.365610 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b2812cb-4bae-4379-89af-005c5629b8f2/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.701933 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b2812cb-4bae-4379-89af-005c5629b8f2/setup-container/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.716781 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2b2812cb-4bae-4379-89af-005c5629b8f2/rabbitmq/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.806456 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lbwlb_17492632-88c9-4d92-9804-12228ba0fdad/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:25 crc kubenswrapper[4676]: I1204 16:57:25.885568 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-w64mp_1daaa5df-a3b1-4ac7-9453-f1fa9c4682fd/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.029572 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-s5bsp_43fc84a7-d9a0-4eba-93e6-c72e566a2b99/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.191666 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5bx5h_47048b08-8efe-4c2b-a449-bad99291721d/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.292100 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-kvjcq_ed758cb2-028d-43a2-b04a-3b494673e6f6/ssh-known-hosts-edpm-deployment/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.556573 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78ffb7b6cf-46b4r_10ac9a17-d069-484c-9f44-baaada4618f8/proxy-server/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.650605 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-78ffb7b6cf-46b4r_10ac9a17-d069-484c-9f44-baaada4618f8/proxy-httpd/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.652691 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-ksj54_e03c083b-3422-4f69-9355-7e8354125352/swift-ring-rebalance/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.805952 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/account-auditor/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.868357 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/account-reaper/0.log"
Dec 04 16:57:26 crc kubenswrapper[4676]: I1204 16:57:26.914008 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/account-replicator/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.003510 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/container-auditor/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.038304 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/account-server/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.142668 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/container-server/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.160447 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/container-replicator/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.218361 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/container-updater/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.340113 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/object-auditor/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.384314 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:57:27 crc kubenswrapper[4676]: E1204 16:57:27.384591 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.426289 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/object-replicator/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.430136 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/object-expirer/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.445951 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/object-server/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.544925 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/object-updater/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.633875 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/rsync/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.730149 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_61ed17c4-ad81-4738-ac71-3b97f42d5211/swift-recon-cron/0.log"
Dec 04 16:57:27 crc kubenswrapper[4676]: I1204 16:57:27.851697 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-h8v2p_739e4574-6964-41c1-833b-3379e794681a/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:28 crc kubenswrapper[4676]: I1204 16:57:28.100252 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2db24e9d-bcf8-4e11-8823-5709bb13d99d/test-operator-logs-container/0.log"
Dec 04 16:57:28 crc kubenswrapper[4676]: I1204 16:57:28.251177 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-pbhrd_c2ce3b93-6fd0-432c-8f42-99cc96bd0aca/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 04 16:57:28 crc kubenswrapper[4676]: I1204 16:57:28.918941 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1728d401-fbd4-470d-8084-deaa0ca6c1b5/tempest-tests-tempest-tests-runner/0.log"
Dec 04 16:57:29 crc kubenswrapper[4676]: I1204 16:57:29.685867 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_b8700a65-1419-4467-8d99-2085481c5890/watcher-applier/0.log"
Dec 04 16:57:29 crc kubenswrapper[4676]: I1204 16:57:29.997071 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5bd9cd7f-a3cb-4304-9ce9-73903875b9cd/watcher-api-log/0.log"
Dec 04 16:57:32 crc kubenswrapper[4676]: I1204 16:57:32.841083 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_d97e77cc-3e3e-4d05-b57e-b87782f3ada8/watcher-decision-engine/0.log"
Dec 04 16:57:34 crc kubenswrapper[4676]: I1204 16:57:34.315019 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5bd9cd7f-a3cb-4304-9ce9-73903875b9cd/watcher-api/0.log"
Dec 04 16:57:38 crc kubenswrapper[4676]: I1204 16:57:38.712884 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_12baa943-6113-449f-ac06-88dd60e224fe/memcached/0.log"
Dec 04 16:57:40 crc kubenswrapper[4676]: I1204 16:57:40.385314 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:57:40 crc kubenswrapper[4676]: E1204 16:57:40.387133 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:54 crc kubenswrapper[4676]: I1204 16:57:54.385763 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:57:54 crc kubenswrapper[4676]: E1204 16:57:54.387025 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:57:57 crc kubenswrapper[4676]: I1204 16:57:57.685067 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/util/0.log"
Dec 04 16:57:57 crc kubenswrapper[4676]: I1204 16:57:57.911853 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/util/0.log"
Dec 04 16:57:57 crc kubenswrapper[4676]: I1204 16:57:57.969085 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/pull/0.log"
Dec 04 16:57:57 crc kubenswrapper[4676]: I1204 16:57:57.986614 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/pull/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.198849 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/util/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.225864 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/pull/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.267151 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_6e52e659d32fb12f5ab7255cfab541613f5294ba28fd8a1d5d6fee802f4vdhn_77e9ca65-5ca8-4d5d-8b88-080a95a82529/extract/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.376688 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-p52sj_191599a4-dee2-4d6c-b7ba-09e4f60faaf5/kube-rbac-proxy/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.488014 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-5bfbbb859d-p52sj_191599a4-dee2-4d6c-b7ba-09e4f60faaf5/manager/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.511728 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-zbsm7_db83cc98-e9f7-4c8a-989a-ad3150de91b9/kube-rbac-proxy/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.650730 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-748967c98-zbsm7_db83cc98-e9f7-4c8a-989a-ad3150de91b9/manager/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.701989 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-hrr7c_c7bf3f72-274b-4db9-8822-25999acad8b6/manager/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.711544 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6788cc6d75-hrr7c_c7bf3f72-274b-4db9-8822-25999acad8b6/kube-rbac-proxy/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.921396 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-85fbd69fcd-7vsrd_25a6adcc-b6f7-41ee-a0d3-9594455bedda/kube-rbac-proxy/0.log"
Dec 04 16:57:58 crc kubenswrapper[4676]: I1204 16:57:58.952618 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-85fbd69fcd-7vsrd_25a6adcc-b6f7-41ee-a0d3-9594455bedda/manager/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.099489 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-h74fd_dbba238e-b271-48f0-9356-c1ba4b7446f8/manager/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.109521 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-698d6fd7d6-h74fd_dbba238e-b271-48f0-9356-c1ba4b7446f8/kube-rbac-proxy/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.167793 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-vf7rm_1b01dbe4-9e3e-403e-938a-22f130b47202/kube-rbac-proxy/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.325970 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-jjrmb_02e4b1ff-3345-4104-b333-cba2f5cd9388/kube-rbac-proxy/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.334393 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7d5d9fd47f-vf7rm_1b01dbe4-9e3e-403e-938a-22f130b47202/manager/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.567893 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-zgqv7_171288d7-22db-4357-bbfc-0f5ffa6b709c/kube-rbac-proxy/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.589015 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6c55d8d69b-jjrmb_02e4b1ff-3345-4104-b333-cba2f5cd9388/manager/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.609193 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-54485f899-zgqv7_171288d7-22db-4357-bbfc-0f5ffa6b709c/manager/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.779164 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-79cc9d59f5-tqc5z_62a08aac-45ea-4944-9d7f-9d78114d07a0/kube-rbac-proxy/0.log"
Dec 04 16:57:59 crc kubenswrapper[4676]: I1204 16:57:59.817197 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-79cc9d59f5-tqc5z_62a08aac-45ea-4944-9d7f-9d78114d07a0/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.001489 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5cbc8c7f96-lpl84_d28e781c-96cf-4377-8cbc-f32b112e3dc7/kube-rbac-proxy/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.029729 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5cbc8c7f96-lpl84_d28e781c-96cf-4377-8cbc-f32b112e3dc7/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.049526 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-5nstv_ee1e0a33-feb5-4a3b-8d62-dca835529d5e/kube-rbac-proxy/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.210500 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-64d7c556cd-5nstv_ee1e0a33-feb5-4a3b-8d62-dca835529d5e/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.272141 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-g2b6v_f5882b54-a120-4eff-88e8-bf0a5d7758ff/kube-rbac-proxy/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.272420 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-58879495c-g2b6v_f5882b54-a120-4eff-88e8-bf0a5d7758ff/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.409860 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-nxgnw_7d5162d9-add8-44b3-8301-82cbd7d09878/kube-rbac-proxy/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.551128 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79d658b66d-nxgnw_7d5162d9-add8-44b3-8301-82cbd7d09878/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.605363 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-tg7br_9890ab17-b307-4506-9420-0a50e671792e/kube-rbac-proxy/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.678762 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-d5fb87cb8-tg7br_9890ab17-b307-4506-9420-0a50e671792e/manager/0.log"
Dec 04 16:58:00 crc kubenswrapper[4676]: I1204 16:58:00.997519 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-f29bx_8a67582d-5c84-40fc-977b-4c0d42d9864b/kube-rbac-proxy/0.log"
Dec 04 16:58:01 crc kubenswrapper[4676]: I1204 16:58:01.052362 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77868f484-f29bx_8a67582d-5c84-40fc-977b-4c0d42d9864b/manager/0.log"
Dec 04 16:58:01 crc kubenswrapper[4676]: I1204 16:58:01.179058 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b8756448-bqf62_468399f0-8b75-47d3-9576-fc4f572fc422/kube-rbac-proxy/0.log"
Dec 04 16:58:01 crc kubenswrapper[4676]: I1204 16:58:01.566836 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-577c877dd7-7ktcv_27c20c8b-a18c-40e3-a45f-4cf9b1fb4510/kube-rbac-proxy/0.log"
Dec 04 16:58:01 crc kubenswrapper[4676]: I1204 16:58:01.795933 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-577c877dd7-7ktcv_27c20c8b-a18c-40e3-a45f-4cf9b1fb4510/operator/0.log"
Dec 04 16:58:01 crc kubenswrapper[4676]: I1204 16:58:01.802991 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fbzg5_24f18240-bbb2-4c1c-b396-e5d2a6d44514/registry-server/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.020991 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-7g426_d373173f-fba9-4fc1-9d7d-5424dca0303e/kube-rbac-proxy/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.066525 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5b67cfc8fb-7g426_d373173f-fba9-4fc1-9d7d-5424dca0303e/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.147103 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-t4p48_53683a17-2c47-4b4c-b145-74620d4d7a16/kube-rbac-proxy/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.298995 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-867d87977b-t4p48_53683a17-2c47-4b4c-b145-74620d4d7a16/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.343822 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-ngw55_3b483864-ee9a-49b1-b75f-5f9b23e9534d/operator/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.509891 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-8f6687c44-24pgj_255159ec-7751-4663-a0b9-0e97f9c0824d/kube-rbac-proxy/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.559928 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6b8756448-bqf62_468399f0-8b75-47d3-9576-fc4f572fc422/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.573706 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-695797c565-mtxgd_66135fe6-10ac-4049-b7a7-e40aa82f78e7/kube-rbac-proxy/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.605715 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-8f6687c44-24pgj_255159ec-7751-4663-a0b9-0e97f9c0824d/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.759575 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-bb86466d8-x7nbg_a2059da3-6c0d-4623-8406-5f25aed58fbf/kube-rbac-proxy/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.819442 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-bb86466d8-x7nbg_a2059da3-6c0d-4623-8406-5f25aed58fbf/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.868527 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-695797c565-mtxgd_66135fe6-10ac-4049-b7a7-e40aa82f78e7/manager/0.log"
Dec 04 16:58:02 crc kubenswrapper[4676]: I1204 16:58:02.964420 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c44f899f9-n7xc5_93e0c78f-854f-4c11-b457-f5e1b429a7bc/kube-rbac-proxy/0.log"
Dec 04 16:58:03 crc kubenswrapper[4676]: I1204 16:58:03.045299 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c44f899f9-n7xc5_93e0c78f-854f-4c11-b457-f5e1b429a7bc/manager/0.log"
Dec 04 16:58:07 crc kubenswrapper[4676]: I1204 16:58:07.387154 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:58:07 crc kubenswrapper[4676]: E1204 16:58:07.387974 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:58:20 crc kubenswrapper[4676]: I1204 16:58:20.384255 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:58:20 crc kubenswrapper[4676]: E1204 16:58:20.385154 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:58:21 crc kubenswrapper[4676]: I1204 16:58:21.994115 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-5hd4h_29205e6d-74be-4a99-b92d-50152cb21845/control-plane-machine-set-operator/0.log"
Dec 04 16:58:22 crc kubenswrapper[4676]: I1204 16:58:22.142186 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8k7hs_76f9c064-9769-41c0-8936-340f895bc36d/kube-rbac-proxy/0.log"
Dec 04 16:58:22 crc kubenswrapper[4676]: I1204 16:58:22.235217 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-8k7hs_76f9c064-9769-41c0-8936-340f895bc36d/machine-api-operator/0.log"
Dec 04 16:58:35 crc kubenswrapper[4676]: I1204 16:58:35.384518 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:58:35 crc kubenswrapper[4676]: E1204 16:58:35.385421 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:58:35 crc kubenswrapper[4676]: I1204 16:58:35.706103 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-ts58n_9df19d98-0550-4720-bed1-056a83f77d6b/cert-manager-controller/0.log"
Dec 04 16:58:35 crc kubenswrapper[4676]: I1204 16:58:35.755486 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kf2h4_b3a1fea5-f2ce-4047-b055-35cdaadd95c2/cert-manager-cainjector/0.log"
Dec 04 16:58:35 crc kubenswrapper[4676]: I1204 16:58:35.881255 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-stttj_b0d87ab1-b5cd-4ef9-8bc4-f7cd211eeef4/cert-manager-webhook/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.365126 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-cpjcs_c4a94816-54e1-4cde-87cd-130411826243/nmstate-console-plugin/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.385026 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:58:49 crc kubenswrapper[4676]: E1204 16:58:49.385314 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.537862 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5tgzz_3a01cabf-b256-487e-840b-db8b85c3de85/kube-rbac-proxy/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.574928 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-5tgzz_3a01cabf-b256-487e-840b-db8b85c3de85/nmstate-metrics/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.595342 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-s57t5_fb8265ae-de57-4ac5-9804-d3becd3a48d5/nmstate-handler/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.771266 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8sbbg_8cbb02ff-f891-4887-b834-ba6f1cf7274c/nmstate-webhook/0.log"
Dec 04 16:58:49 crc kubenswrapper[4676]: I1204 16:58:49.823673 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-w7zxf_d88f4c5a-fc64-4912-a4c4-7a2af156aa3f/nmstate-operator/0.log"
Dec 04 16:59:00 crc kubenswrapper[4676]: I1204 16:59:00.384087 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c"
Dec 04 16:59:00 crc kubenswrapper[4676]: E1204 16:59:00.384761 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"
Dec
04 16:59:04 crc kubenswrapper[4676]: I1204 16:59:04.972306 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5p58x_4d025efd-41d1-4aa2-8bdf-348a4e378082/kube-rbac-proxy/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.168452 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-5p58x_4d025efd-41d1-4aa2-8bdf-348a4e378082/controller/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.244733 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-frr-files/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.699467 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-reloader/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.708708 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-reloader/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.722106 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-metrics/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.729104 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-frr-files/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.956316 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-reloader/0.log" Dec 04 16:59:05 crc kubenswrapper[4676]: I1204 16:59:05.975012 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-metrics/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.004531 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-frr-files/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.010415 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-metrics/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.201368 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-frr-files/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.208404 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/controller/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.237645 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-reloader/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.340755 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/cp-metrics/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.440803 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/frr-metrics/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.460646 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/kube-rbac-proxy/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.601796 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/kube-rbac-proxy-frr/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.691490 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/reloader/0.log" Dec 04 16:59:06 crc kubenswrapper[4676]: I1204 16:59:06.887746 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7g2tv_652c71f4-1df3-45cb-9540-fac675f8134f/frr-k8s-webhook-server/0.log" Dec 04 16:59:07 crc kubenswrapper[4676]: I1204 16:59:07.102487 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d9899ddf8-t2gzf_e2b1da94-9d99-4645-af39-b9429c50896e/manager/0.log" Dec 04 16:59:07 crc kubenswrapper[4676]: I1204 16:59:07.342386 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-55ff4bc57f-ctsr2_c5df83ac-ab2b-4fbb-8f48-f8e2c7eca443/webhook-server/0.log" Dec 04 16:59:07 crc kubenswrapper[4676]: I1204 16:59:07.528157 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pn92f_a4165e19-a60f-458e-904c-9092df340dd0/kube-rbac-proxy/0.log" Dec 04 16:59:08 crc kubenswrapper[4676]: I1204 16:59:08.077444 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pn92f_a4165e19-a60f-458e-904c-9092df340dd0/speaker/0.log" Dec 04 16:59:08 crc kubenswrapper[4676]: I1204 16:59:08.355850 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r4r27_e0d02430-19e7-4515-ac98-59549551ec90/frr/0.log" Dec 04 16:59:11 crc kubenswrapper[4676]: I1204 16:59:11.384489 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:59:11 crc kubenswrapper[4676]: E1204 16:59:11.385344 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 16:59:22 crc kubenswrapper[4676]: I1204 16:59:22.279293 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/util/0.log" Dec 04 16:59:22 crc kubenswrapper[4676]: I1204 16:59:22.499327 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/pull/0.log" Dec 04 16:59:22 crc kubenswrapper[4676]: I1204 16:59:22.501387 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/util/0.log" Dec 04 16:59:22 crc kubenswrapper[4676]: I1204 16:59:22.557459 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/pull/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.084842 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/pull/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.131774 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/util/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.177960 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fwx4b6_d43defed-8b48-4daa-83b5-3b44b845c0d8/extract/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.347997 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/util/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.480753 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/pull/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.493017 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/util/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.524996 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/pull/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.633209 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/util/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.663650 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/pull/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.726803 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210xpn9r_6d73b25e-dd84-468b-81dd-5d584a083fe0/extract/0.log" Dec 04 16:59:23 crc kubenswrapper[4676]: I1204 16:59:23.807131 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/util/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.003207 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/pull/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.022993 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/pull/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.029409 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/util/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.236257 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/util/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.264823 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/extract/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.272953 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83d6zj4_c1aa4cb1-4632-4d55-a604-7a1a853ba9c6/pull/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.415837 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-utilities/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.634261 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-content/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.634953 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-content/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.636152 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-utilities/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.855602 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-utilities/0.log" Dec 04 16:59:24 crc kubenswrapper[4676]: I1204 16:59:24.924322 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/extract-content/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.027550 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-csmsh_3d5c92e1-8000-4f13-8480-1e099388fbfa/registry-server/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.097189 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-utilities/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.280481 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-content/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.297091 4676 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-utilities/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.324887 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-content/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.384250 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.516432 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-utilities/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.595081 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/extract-content/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.627931 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5"} Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.907293 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jgxsk_84e2ecfe-0652-42eb-9440-0b03a4722150/marketplace-operator/0.log" Dec 04 16:59:25 crc kubenswrapper[4676]: I1204 16:59:25.952531 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-utilities/0.log" Dec 04 16:59:26 crc kubenswrapper[4676]: I1204 16:59:26.591121 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-utilities/0.log" Dec 04 16:59:26 crc kubenswrapper[4676]: I1204 16:59:26.687518 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-content/0.log" Dec 04 16:59:26 crc kubenswrapper[4676]: I1204 16:59:26.895029 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-content/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.003360 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mb2t7_e655c075-09f9-4409-a370-0acced242279/registry-server/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.066844 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-utilities/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.118024 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/extract-content/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.228898 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-utilities/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.329665 4676 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-b6wkg_b7ad7e78-7f85-4c56-8aa9-12aeef76c043/registry-server/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.449419 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-content/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.466842 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-utilities/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.475701 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-content/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.645174 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-content/0.log" Dec 04 16:59:27 crc kubenswrapper[4676]: I1204 16:59:27.667626 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/extract-utilities/0.log" Dec 04 16:59:28 crc kubenswrapper[4676]: I1204 16:59:28.195739 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhjg6_3911d80c-3e19-4fbf-ace6-752742bea61a/registry-server/0.log" Dec 04 16:59:40 crc kubenswrapper[4676]: I1204 16:59:40.489995 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-vc8st_cc91d5b7-ea24-4585-9ac8-bd227c1a186e/prometheus-operator/0.log" Dec 04 16:59:40 crc kubenswrapper[4676]: I1204 16:59:40.689842 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ddf4f6df-wsbsx_6d99fa21-223e-4928-a32c-52a3ccbd69d4/prometheus-operator-admission-webhook/0.log" Dec 04 16:59:40 crc kubenswrapper[4676]: I1204 16:59:40.693937 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6ddf4f6df-rlwrl_ff573696-bc37-470b-a8b6-14c5218baa8f/prometheus-operator-admission-webhook/0.log" Dec 04 16:59:40 crc kubenswrapper[4676]: I1204 16:59:40.909302 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-fl2ph_88a9075d-6a0a-4172-b28c-979ad7fff84b/operator/0.log" Dec 04 16:59:40 crc kubenswrapper[4676]: I1204 16:59:40.924285 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-lxcjx_667d2ce6-ef89-4b36-a200-194e5f7861ad/perses-operator/0.log" Dec 04 16:59:52 crc kubenswrapper[4676]: E1204 16:59:52.286876 4676 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.158:57912->38.102.83.158:40877: write tcp 38.102.83.158:57912->38.102.83.158:40877: write: broken pipe Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.157445 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt"] Dec 04 17:00:00 crc kubenswrapper[4676]: E1204 17:00:00.158808 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3f02b46-8df8-40a5-89b5-72b94a15c519" containerName="container-00" Dec 04 17:00:00 crc 
kubenswrapper[4676]: I1204 17:00:00.158849 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3f02b46-8df8-40a5-89b5-72b94a15c519" containerName="container-00" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.159254 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3f02b46-8df8-40a5-89b5-72b94a15c519" containerName="container-00" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.160501 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.163061 4676 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.164239 4676 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.171132 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt"] Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.227044 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmw79\" (UniqueName: \"kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.227125 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.227194 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.328843 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.328936 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.329129 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmw79\" (UniqueName: 
\"kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.330267 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.338488 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.364664 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmw79\" (UniqueName: \"kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79\") pod \"collect-profiles-29414460-djknt\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:00 crc kubenswrapper[4676]: I1204 17:00:00.497135 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:01 crc kubenswrapper[4676]: I1204 17:00:01.011891 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt"] Dec 04 17:00:01 crc kubenswrapper[4676]: I1204 17:00:01.991975 4676 generic.go:334] "Generic (PLEG): container finished" podID="5fd252fe-7580-459f-9313-25fae3db2dbd" containerID="d9d353c7e520af08890a10a2212eacb7ff0225d4e430e8e487f3f3b0fb1fee58" exitCode=0 Dec 04 17:00:01 crc kubenswrapper[4676]: I1204 17:00:01.992067 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" event={"ID":"5fd252fe-7580-459f-9313-25fae3db2dbd","Type":"ContainerDied","Data":"d9d353c7e520af08890a10a2212eacb7ff0225d4e430e8e487f3f3b0fb1fee58"} Dec 04 17:00:01 crc kubenswrapper[4676]: I1204 17:00:01.992324 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" event={"ID":"5fd252fe-7580-459f-9313-25fae3db2dbd","Type":"ContainerStarted","Data":"aab0c31ec0da46f03b60a55e82ed94d988f2193460f3b2f8b18d224ed1ded6ba"} Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.399749 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.411447 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume\") pod \"5fd252fe-7580-459f-9313-25fae3db2dbd\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.411499 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmw79\" (UniqueName: \"kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79\") pod \"5fd252fe-7580-459f-9313-25fae3db2dbd\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.411578 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume\") pod \"5fd252fe-7580-459f-9313-25fae3db2dbd\" (UID: \"5fd252fe-7580-459f-9313-25fae3db2dbd\") " Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.413127 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume" (OuterVolumeSpecName: "config-volume") pod "5fd252fe-7580-459f-9313-25fae3db2dbd" (UID: "5fd252fe-7580-459f-9313-25fae3db2dbd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.417437 4676 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd252fe-7580-459f-9313-25fae3db2dbd-config-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.424244 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5fd252fe-7580-459f-9313-25fae3db2dbd" (UID: "5fd252fe-7580-459f-9313-25fae3db2dbd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.451154 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79" (OuterVolumeSpecName: "kube-api-access-jmw79") pod "5fd252fe-7580-459f-9313-25fae3db2dbd" (UID: "5fd252fe-7580-459f-9313-25fae3db2dbd"). InnerVolumeSpecName "kube-api-access-jmw79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.519532 4676 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5fd252fe-7580-459f-9313-25fae3db2dbd-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 04 17:00:03 crc kubenswrapper[4676]: I1204 17:00:03.519577 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmw79\" (UniqueName: \"kubernetes.io/projected/5fd252fe-7580-459f-9313-25fae3db2dbd-kube-api-access-jmw79\") on node \"crc\" DevicePath \"\"" Dec 04 17:00:04 crc kubenswrapper[4676]: I1204 17:00:04.010063 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" event={"ID":"5fd252fe-7580-459f-9313-25fae3db2dbd","Type":"ContainerDied","Data":"aab0c31ec0da46f03b60a55e82ed94d988f2193460f3b2f8b18d224ed1ded6ba"} Dec 04 17:00:04 crc kubenswrapper[4676]: I1204 17:00:04.010339 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aab0c31ec0da46f03b60a55e82ed94d988f2193460f3b2f8b18d224ed1ded6ba" Dec 04 17:00:04 crc kubenswrapper[4676]: I1204 17:00:04.010160 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29414460-djknt" Dec 04 17:00:04 crc kubenswrapper[4676]: I1204 17:00:04.484174 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz"] Dec 04 17:00:04 crc kubenswrapper[4676]: I1204 17:00:04.493708 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29414415-98jrz"] Dec 04 17:00:05 crc kubenswrapper[4676]: I1204 17:00:05.407255 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9026fa1-14f7-4dfe-90bd-c8fb160f18a0" path="/var/lib/kubelet/pods/b9026fa1-14f7-4dfe-90bd-c8fb160f18a0/volumes" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.193378 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:00:29 crc kubenswrapper[4676]: E1204 17:00:29.194366 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd252fe-7580-459f-9313-25fae3db2dbd" containerName="collect-profiles" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.194382 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd252fe-7580-459f-9313-25fae3db2dbd" containerName="collect-profiles" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.194627 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd252fe-7580-459f-9313-25fae3db2dbd" containerName="collect-profiles" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.196431 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.208810 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.279080 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.279139 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9m4\" (UniqueName: \"kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.279410 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.381136 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.381363 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.381405 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9m4\" (UniqueName: \"kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.381821 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.381863 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.408003 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tx9m4\" (UniqueName: \"kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4\") pod \"redhat-operators-ffhjz\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:29 crc kubenswrapper[4676]: I1204 17:00:29.531305 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:30 crc kubenswrapper[4676]: I1204 17:00:30.005829 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:00:30 crc kubenswrapper[4676]: I1204 17:00:30.423943 4676 generic.go:334] "Generic (PLEG): container finished" podID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerID="832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3" exitCode=0 Dec 04 17:00:30 crc kubenswrapper[4676]: I1204 17:00:30.423999 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerDied","Data":"832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3"} Dec 04 17:00:30 crc kubenswrapper[4676]: I1204 17:00:30.424028 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerStarted","Data":"ba3a237252cb8c1746dc722e46f82902a2015c16656a486c798ea6150f7e2774"} Dec 04 17:00:30 crc kubenswrapper[4676]: I1204 17:00:30.429363 4676 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 04 17:00:32 crc kubenswrapper[4676]: I1204 17:00:32.447442 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerStarted","Data":"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9"} Dec 04 17:00:38 crc kubenswrapper[4676]: I1204 17:00:38.927961 4676 generic.go:334] "Generic (PLEG): container finished" podID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerID="243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9" exitCode=0 Dec 04 17:00:38 crc kubenswrapper[4676]: I1204 17:00:38.932593 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerDied","Data":"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9"} Dec 04 17:00:41 crc kubenswrapper[4676]: I1204 17:00:41.104696 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerStarted","Data":"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3"} Dec 04 17:00:41 crc kubenswrapper[4676]: I1204 17:00:41.135626 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ffhjz" podStartSLOduration=2.530135836 podStartE2EDuration="12.135607859s" podCreationTimestamp="2025-12-04 17:00:29 +0000 UTC" firstStartedPulling="2025-12-04 17:00:30.429139103 +0000 UTC m=+6037.863808960" lastFinishedPulling="2025-12-04 17:00:40.034611126 +0000 UTC m=+6047.469280983" observedRunningTime="2025-12-04 17:00:41.133588881 +0000 UTC m=+6048.568258738" watchObservedRunningTime="2025-12-04 17:00:41.135607859 +0000 UTC m=+6048.570277706" Dec 04 17:00:49 crc 
kubenswrapper[4676]: I1204 17:00:49.532396 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:49 crc kubenswrapper[4676]: I1204 17:00:49.532930 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:50 crc kubenswrapper[4676]: I1204 17:00:50.587290 4676 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ffhjz" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="registry-server" probeResult="failure" output=< Dec 04 17:00:50 crc kubenswrapper[4676]: timeout: failed to connect service ":50051" within 1s Dec 04 17:00:50 crc kubenswrapper[4676]: > Dec 04 17:00:59 crc kubenswrapper[4676]: I1204 17:00:59.604302 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:00:59 crc kubenswrapper[4676]: I1204 17:00:59.654806 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.173720 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29414461-7x2bj"] Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.175281 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.186826 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414461-7x2bj"] Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.235151 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.235232 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.235272 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.235406 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqmfm\" (UniqueName: \"kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.338030 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqmfm\" (UniqueName: 
\"kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.338379 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.338581 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.338692 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.345914 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.346216 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.358535 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.360845 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqmfm\" (UniqueName: \"kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm\") pod \"keystone-cron-29414461-7x2bj\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.556824 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:00 crc kubenswrapper[4676]: I1204 17:01:00.786838 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.017222 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29414461-7x2bj"] Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.433738 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414461-7x2bj" event={"ID":"66d21b15-c63b-4b71-9c05-21ca7e59655a","Type":"ContainerStarted","Data":"6eeb3c7321143249969641aecad9e8411d7a4e19f317bbdb62fdf40faea3d741"} Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.433784 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414461-7x2bj" event={"ID":"66d21b15-c63b-4b71-9c05-21ca7e59655a","Type":"ContainerStarted","Data":"beee813ef2401f0126fb20e24b592f448e331895aa4bc0c157e142178e699dfa"} Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.433878 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ffhjz" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="registry-server" containerID="cri-o://b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3" gracePeriod=2 Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.456690 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29414461-7x2bj" podStartSLOduration=1.456674897 podStartE2EDuration="1.456674897s" podCreationTimestamp="2025-12-04 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-04 17:01:01.455188425 +0000 UTC m=+6068.889858282" watchObservedRunningTime="2025-12-04 17:01:01.456674897 +0000 UTC m=+6068.891344754" Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.913264 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.960023 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities\") pod \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.960316 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx9m4\" (UniqueName: \"kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4\") pod \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.960461 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content\") pod \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\" (UID: \"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596\") " Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.961247 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities" (OuterVolumeSpecName: "utilities") pod "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" (UID: "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:01:01 crc kubenswrapper[4676]: I1204 17:01:01.966179 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4" (OuterVolumeSpecName: "kube-api-access-tx9m4") pod "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" (UID: "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596"). InnerVolumeSpecName "kube-api-access-tx9m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.064311 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx9m4\" (UniqueName: \"kubernetes.io/projected/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-kube-api-access-tx9m4\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.064588 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.071767 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" (UID: "7491c3bb-e6c6-43ac-9cb2-9dc4637fe596"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.167103 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.202203 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.202716 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="extract-utilities" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.202729 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="extract-utilities" Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.202745 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="registry-server" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.202752 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="registry-server" Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.202766 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="extract-content" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.202772 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="extract-content" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.202980 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerName="registry-server" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.204787 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.212793 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.268740 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.269258 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.269340 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknvn\" (UniqueName: \"kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.371414 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hknvn\" (UniqueName: \"kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.371574 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.371732 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.372175 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.372202 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.392355 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hknvn\" (UniqueName: \"kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn\") pod \"community-operators-vp7r2\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.444995 4676 generic.go:334] "Generic (PLEG): container finished" podID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" containerID="b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3" exitCode=0 Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.445076 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerDied","Data":"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3"} Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.445360 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffhjz" event={"ID":"7491c3bb-e6c6-43ac-9cb2-9dc4637fe596","Type":"ContainerDied","Data":"ba3a237252cb8c1746dc722e46f82902a2015c16656a486c798ea6150f7e2774"} Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.445404 4676 scope.go:117] "RemoveContainer" containerID="b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.445088 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ffhjz" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.475154 4676 scope.go:117] "RemoveContainer" containerID="243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.498792 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.508711 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ffhjz"] Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.534395 4676 scope.go:117] "RemoveContainer" containerID="832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.548416 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.600211 4676 scope.go:117] "RemoveContainer" containerID="b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3" Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.604018 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3\": container with ID starting with b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3 not found: ID does not exist" containerID="b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.604057 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3"} err="failed to get container status \"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3\": rpc error: code = NotFound desc = could not find container \"b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3\": container with ID starting with b79414a68c732fff88b36c0fea1d81586509ad42ecf6f901da05ee6d27970da3 not found: ID does not exist" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.604084 4676 scope.go:117] "RemoveContainer" containerID="243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9" Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.605436 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9\": container with ID starting with 243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9 not found: ID does not exist" containerID="243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.605478 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9"} err="failed to get container status \"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9\": rpc error: code = NotFound desc = could not find container \"243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9\": container with ID starting with 243e1e7c1b354efb2233a222336f14e7a07fea30aa326ef73082dd83eb5ca9d9 not found: ID does not exist" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.605497 4676 scope.go:117] "RemoveContainer" containerID="832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3" Dec 04 17:01:02 crc kubenswrapper[4676]: E1204 17:01:02.605749 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3\": container with ID starting with 832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3 not found: ID does not exist" containerID="832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3" Dec 04 17:01:02 crc kubenswrapper[4676]: I1204 17:01:02.605788 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3"} err="failed to get container status \"832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3\": rpc error: code = 
NotFound desc = could not find container \"832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3\": container with ID starting with 832c20fbc12bf9ca4e3cc6d34f3db8cfd71d7008368324ec4c486e7675ef15d3 not found: ID does not exist" Dec 04 17:01:03 crc kubenswrapper[4676]: I1204 17:01:03.089386 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:03 crc kubenswrapper[4676]: W1204 17:01:03.094141 4676 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a085244_1140_4f9c_8966_cf2ce3d1d074.slice/crio-91f6929721a47e691b93f8aa860dc932ab079c56fcabe0793a8eecda2eadcc76 WatchSource:0}: Error finding container 91f6929721a47e691b93f8aa860dc932ab079c56fcabe0793a8eecda2eadcc76: Status 404 returned error can't find the container with id 91f6929721a47e691b93f8aa860dc932ab079c56fcabe0793a8eecda2eadcc76 Dec 04 17:01:03 crc kubenswrapper[4676]: I1204 17:01:03.413988 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7491c3bb-e6c6-43ac-9cb2-9dc4637fe596" path="/var/lib/kubelet/pods/7491c3bb-e6c6-43ac-9cb2-9dc4637fe596/volumes" Dec 04 17:01:03 crc kubenswrapper[4676]: I1204 17:01:03.462548 4676 generic.go:334] "Generic (PLEG): container finished" podID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerID="c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4" exitCode=0 Dec 04 17:01:03 crc kubenswrapper[4676]: I1204 17:01:03.462635 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerDied","Data":"c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4"} Dec 04 17:01:03 crc kubenswrapper[4676]: I1204 17:01:03.462678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerStarted","Data":"91f6929721a47e691b93f8aa860dc932ab079c56fcabe0793a8eecda2eadcc76"} Dec 04 17:01:04 crc kubenswrapper[4676]: I1204 17:01:04.418111 4676 scope.go:117] "RemoveContainer" containerID="50f9b1e03d8d94f70b8d649008570ece80b7625e773edfd50995b3c35a19dd70" Dec 04 17:01:04 crc kubenswrapper[4676]: I1204 17:01:04.474634 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerStarted","Data":"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9"} Dec 04 17:01:05 crc kubenswrapper[4676]: I1204 17:01:05.487309 4676 generic.go:334] "Generic (PLEG): container finished" podID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerID="7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9" exitCode=0 Dec 04 17:01:05 crc kubenswrapper[4676]: I1204 17:01:05.487428 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerDied","Data":"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9"} Dec 04 17:01:05 crc kubenswrapper[4676]: I1204 17:01:05.489895 4676 generic.go:334] "Generic (PLEG): container finished" podID="66d21b15-c63b-4b71-9c05-21ca7e59655a" containerID="6eeb3c7321143249969641aecad9e8411d7a4e19f317bbdb62fdf40faea3d741" exitCode=0 Dec 04 17:01:05 crc kubenswrapper[4676]: I1204 17:01:05.489964 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
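The RemoveContainer / "DeleteContainer returned error" pairs above are not real failures: the kubelet asks CRI-O for the status of containers that garbage collection already pruned, gets gRPC NotFound back, logs it, and carries on. A sketch of that idempotent-delete pattern, with removeContainer as a hypothetical stand-in for the CRI call (not kubelet's actual code path):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer is a hypothetical stand-in for a CRI RemoveContainer
// call; here it always reports the container as already gone.
func removeContainer(id string) error {
	return status.Error(codes.NotFound, "could not find container "+id)
}

// cleanup treats NotFound as success: whether this caller removed the
// container or GC beat it to the punch, the end state is identical,
// which is why pod_container_deletor.go above just logs and moves on.
func cleanup(id string) error {
	err := removeContainer(id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil
	}
	return fmt.Errorf("remove container %s: %w", id, err)
}

func main() {
	fmt.Println(cleanup("b79414a68c73")) // prints <nil>
}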
pod="openstack/keystone-cron-29414461-7x2bj" event={"ID":"66d21b15-c63b-4b71-9c05-21ca7e59655a","Type":"ContainerDied","Data":"6eeb3c7321143249969641aecad9e8411d7a4e19f317bbdb62fdf40faea3d741"} Dec 04 17:01:06 crc kubenswrapper[4676]: I1204 17:01:06.524834 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerStarted","Data":"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f"} Dec 04 17:01:06 crc kubenswrapper[4676]: I1204 17:01:06.561188 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vp7r2" podStartSLOduration=2.155230011 podStartE2EDuration="4.561169277s" podCreationTimestamp="2025-12-04 17:01:02 +0000 UTC" firstStartedPulling="2025-12-04 17:01:03.467241832 +0000 UTC m=+6070.901911699" lastFinishedPulling="2025-12-04 17:01:05.873181108 +0000 UTC m=+6073.307850965" observedRunningTime="2025-12-04 17:01:06.552352706 +0000 UTC m=+6073.987022563" watchObservedRunningTime="2025-12-04 17:01:06.561169277 +0000 UTC m=+6073.995839134" Dec 04 17:01:06 crc kubenswrapper[4676]: I1204 17:01:06.895592 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.084350 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle\") pod \"66d21b15-c63b-4b71-9c05-21ca7e59655a\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.084698 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqmfm\" (UniqueName: \"kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm\") pod \"66d21b15-c63b-4b71-9c05-21ca7e59655a\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.084736 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys\") pod \"66d21b15-c63b-4b71-9c05-21ca7e59655a\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.084864 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data\") pod \"66d21b15-c63b-4b71-9c05-21ca7e59655a\" (UID: \"66d21b15-c63b-4b71-9c05-21ca7e59655a\") " Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.097929 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "66d21b15-c63b-4b71-9c05-21ca7e59655a" (UID: "66d21b15-c63b-4b71-9c05-21ca7e59655a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.106304 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm" (OuterVolumeSpecName: "kube-api-access-bqmfm") pod "66d21b15-c63b-4b71-9c05-21ca7e59655a" (UID: "66d21b15-c63b-4b71-9c05-21ca7e59655a"). 
InnerVolumeSpecName "kube-api-access-bqmfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.124581 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66d21b15-c63b-4b71-9c05-21ca7e59655a" (UID: "66d21b15-c63b-4b71-9c05-21ca7e59655a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.190704 4676 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.190743 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqmfm\" (UniqueName: \"kubernetes.io/projected/66d21b15-c63b-4b71-9c05-21ca7e59655a-kube-api-access-bqmfm\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.190754 4676 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.198809 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data" (OuterVolumeSpecName: "config-data") pod "66d21b15-c63b-4b71-9c05-21ca7e59655a" (UID: "66d21b15-c63b-4b71-9c05-21ca7e59655a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.292588 4676 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66d21b15-c63b-4b71-9c05-21ca7e59655a-config-data\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:07 crc kubenswrapper[4676]: E1204 17:01:07.452251 4676 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d21b15_c63b_4b71_9c05_21ca7e59655a.slice\": RecentStats: unable to find data in memory cache]" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.538408 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29414461-7x2bj" Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.538402 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29414461-7x2bj" event={"ID":"66d21b15-c63b-4b71-9c05-21ca7e59655a","Type":"ContainerDied","Data":"beee813ef2401f0126fb20e24b592f448e331895aa4bc0c157e142178e699dfa"} Dec 04 17:01:07 crc kubenswrapper[4676]: I1204 17:01:07.538480 4676 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beee813ef2401f0126fb20e24b592f448e331895aa4bc0c157e142178e699dfa" Dec 04 17:01:12 crc kubenswrapper[4676]: I1204 17:01:12.549110 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:12 crc kubenswrapper[4676]: I1204 17:01:12.549722 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:12 crc kubenswrapper[4676]: I1204 17:01:12.602479 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:12 crc kubenswrapper[4676]: I1204 17:01:12.659115 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:12 crc kubenswrapper[4676]: I1204 17:01:12.847745 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:14 crc kubenswrapper[4676]: I1204 17:01:14.707243 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vp7r2" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="registry-server" containerID="cri-o://dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f" gracePeriod=2 Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.487262 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.622189 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknvn\" (UniqueName: \"kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn\") pod \"5a085244-1140-4f9c-8966-cf2ce3d1d074\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.622409 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content\") pod \"5a085244-1140-4f9c-8966-cf2ce3d1d074\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.622467 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities\") pod \"5a085244-1140-4f9c-8966-cf2ce3d1d074\" (UID: \"5a085244-1140-4f9c-8966-cf2ce3d1d074\") " Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.623300 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities" (OuterVolumeSpecName: "utilities") pod "5a085244-1140-4f9c-8966-cf2ce3d1d074" (UID: "5a085244-1140-4f9c-8966-cf2ce3d1d074"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.630660 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn" (OuterVolumeSpecName: "kube-api-access-hknvn") pod "5a085244-1140-4f9c-8966-cf2ce3d1d074" (UID: "5a085244-1140-4f9c-8966-cf2ce3d1d074"). InnerVolumeSpecName "kube-api-access-hknvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.800460 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.801466 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hknvn\" (UniqueName: \"kubernetes.io/projected/5a085244-1140-4f9c-8966-cf2ce3d1d074-kube-api-access-hknvn\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.804087 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a085244-1140-4f9c-8966-cf2ce3d1d074" (UID: "5a085244-1140-4f9c-8966-cf2ce3d1d074"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.812355 4676 generic.go:334] "Generic (PLEG): container finished" podID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerID="dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f" exitCode=0 Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.812411 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerDied","Data":"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f"} Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.812448 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vp7r2" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.812474 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vp7r2" event={"ID":"5a085244-1140-4f9c-8966-cf2ce3d1d074","Type":"ContainerDied","Data":"91f6929721a47e691b93f8aa860dc932ab079c56fcabe0793a8eecda2eadcc76"} Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.812499 4676 scope.go:117] "RemoveContainer" containerID="dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.836368 4676 scope.go:117] "RemoveContainer" containerID="7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.862189 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.871702 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vp7r2"] Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.877692 4676 scope.go:117] "RemoveContainer" containerID="c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.899058 4676 scope.go:117] "RemoveContainer" containerID="dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f" Dec 04 17:01:15 crc kubenswrapper[4676]: E1204 17:01:15.901338 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f\": container with ID starting with dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f not found: ID does not exist" containerID="dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.901374 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f"} err="failed to get container status \"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f\": rpc error: code = NotFound desc = could not find container \"dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f\": container with ID starting with dccb4b9dfbfc7ff86378f49a33b8a2b32b95349225b43c5995e67979de6b1c4f not found: ID does not exist" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.901395 4676 scope.go:117] "RemoveContainer" containerID="7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9" Dec 04 17:01:15 crc kubenswrapper[4676]: E1204 17:01:15.901699 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9\": container with ID starting with 7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9 not found: ID does not exist" containerID="7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.901775 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9"} err="failed to get container status \"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9\": rpc error: code = NotFound desc = could not find 
container \"7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9\": container with ID starting with 7188952ea69c8541da3f538d846fb212a15e8c344974baa33043b5d45b1f74d9 not found: ID does not exist" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.901792 4676 scope.go:117] "RemoveContainer" containerID="c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4" Dec 04 17:01:15 crc kubenswrapper[4676]: E1204 17:01:15.902025 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4\": container with ID starting with c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4 not found: ID does not exist" containerID="c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.902053 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4"} err="failed to get container status \"c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4\": rpc error: code = NotFound desc = could not find container \"c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4\": container with ID starting with c71e6bd42d23ddaaa0e5b912aaaae9029b712b293559182c14decf6edfd6cae4 not found: ID does not exist" Dec 04 17:01:15 crc kubenswrapper[4676]: I1204 17:01:15.902944 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a085244-1140-4f9c-8966-cf2ce3d1d074-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:01:17 crc kubenswrapper[4676]: I1204 17:01:17.398297 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" path="/var/lib/kubelet/pods/5a085244-1140-4f9c-8966-cf2ce3d1d074/volumes" Dec 04 17:01:46 crc kubenswrapper[4676]: I1204 17:01:46.026615 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:01:46 crc kubenswrapper[4676]: I1204 17:01:46.027442 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:01:52 crc kubenswrapper[4676]: I1204 17:01:52.715635 4676 generic.go:334] "Generic (PLEG): container finished" podID="76d769e1-ed6f-4192-bee8-d36d31249051" containerID="c9e5428563fb1411cf59b64f42b8df2dfd17924001cfad837208a417370ff854" exitCode=0 Dec 04 17:01:52 crc kubenswrapper[4676]: I1204 17:01:52.715726 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-99fmm/must-gather-7bh8h" event={"ID":"76d769e1-ed6f-4192-bee8-d36d31249051","Type":"ContainerDied","Data":"c9e5428563fb1411cf59b64f42b8df2dfd17924001cfad837208a417370ff854"} Dec 04 17:01:52 crc kubenswrapper[4676]: I1204 17:01:52.717088 4676 scope.go:117] "RemoveContainer" containerID="c9e5428563fb1411cf59b64f42b8df2dfd17924001cfad837208a417370ff854" Dec 04 17:01:52 crc kubenswrapper[4676]: I1204 17:01:52.872920 4676 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-must-gather-99fmm_must-gather-7bh8h_76d769e1-ed6f-4192-bee8-d36d31249051/gather/0.log" Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.506261 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-99fmm/must-gather-7bh8h"] Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.507655 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-99fmm/must-gather-7bh8h" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="copy" containerID="cri-o://ed34e5bc679f22102152d411699172f301b47fd29bf87582b434113fd0617af7" gracePeriod=2 Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.522511 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-99fmm/must-gather-7bh8h"] Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.825271 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99fmm_must-gather-7bh8h_76d769e1-ed6f-4192-bee8-d36d31249051/copy/0.log" Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.826006 4676 generic.go:334] "Generic (PLEG): container finished" podID="76d769e1-ed6f-4192-bee8-d36d31249051" containerID="ed34e5bc679f22102152d411699172f301b47fd29bf87582b434113fd0617af7" exitCode=143 Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.990351 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99fmm_must-gather-7bh8h_76d769e1-ed6f-4192-bee8-d36d31249051/copy/0.log" Dec 04 17:02:01 crc kubenswrapper[4676]: I1204 17:02:01.991251 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/must-gather-7bh8h" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.158744 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output\") pod \"76d769e1-ed6f-4192-bee8-d36d31249051\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.158890 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q486r\" (UniqueName: \"kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r\") pod \"76d769e1-ed6f-4192-bee8-d36d31249051\" (UID: \"76d769e1-ed6f-4192-bee8-d36d31249051\") " Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.166438 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r" (OuterVolumeSpecName: "kube-api-access-q486r") pod "76d769e1-ed6f-4192-bee8-d36d31249051" (UID: "76d769e1-ed6f-4192-bee8-d36d31249051"). InnerVolumeSpecName "kube-api-access-q486r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.260769 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q486r\" (UniqueName: \"kubernetes.io/projected/76d769e1-ed6f-4192-bee8-d36d31249051-kube-api-access-q486r\") on node \"crc\" DevicePath \"\"" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.344376 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "76d769e1-ed6f-4192-bee8-d36d31249051" (UID: "76d769e1-ed6f-4192-bee8-d36d31249051"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.362445 4676 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/76d769e1-ed6f-4192-bee8-d36d31249051-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.839478 4676 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-99fmm_must-gather-7bh8h_76d769e1-ed6f-4192-bee8-d36d31249051/copy/0.log" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.839858 4676 scope.go:117] "RemoveContainer" containerID="ed34e5bc679f22102152d411699172f301b47fd29bf87582b434113fd0617af7" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.840076 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-99fmm/must-gather-7bh8h" Dec 04 17:02:02 crc kubenswrapper[4676]: I1204 17:02:02.878148 4676 scope.go:117] "RemoveContainer" containerID="c9e5428563fb1411cf59b64f42b8df2dfd17924001cfad837208a417370ff854" Dec 04 17:02:03 crc kubenswrapper[4676]: I1204 17:02:03.405880 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" path="/var/lib/kubelet/pods/76d769e1-ed6f-4192-bee8-d36d31249051/volumes" Dec 04 17:02:04 crc kubenswrapper[4676]: I1204 17:02:04.500548 4676 scope.go:117] "RemoveContainer" containerID="58e48defa5d5b71438c3f2382d2e1180f6ded91aca0510db43478fd5097ae70a" Dec 04 17:02:16 crc kubenswrapper[4676]: I1204 17:02:16.026756 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:02:16 crc kubenswrapper[4676]: I1204 17:02:16.027340 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.026501 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.027410 4676 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.027553 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.029129 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.029263 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5" gracePeriod=600 Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.378480 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5" exitCode=0 Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.378705 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5"} Dec 04 17:02:46 crc kubenswrapper[4676]: I1204 17:02:46.378831 4676 scope.go:117] "RemoveContainer" containerID="3d5b9c62e8a5101ce9f2207c1c987eac3c2aeef14c37b7a6a503b1d39163b77c" Dec 04 17:02:47 crc kubenswrapper[4676]: I1204 17:02:47.396548 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerStarted","Data":"4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780"} Dec 04 17:03:04 crc kubenswrapper[4676]: I1204 17:03:04.623416 4676 scope.go:117] "RemoveContainer" containerID="78a96f6e959007c020442e2b51fa3f90ff10af3a64b87eab75578b2e5c0209ff" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.397060 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.397807 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d21b15-c63b-4b71-9c05-21ca7e59655a" containerName="keystone-cron" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.397829 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d21b15-c63b-4b71-9c05-21ca7e59655a" containerName="keystone-cron" Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.397842 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="extract-utilities" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.397848 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" 
containerName="extract-utilities" Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.397873 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="copy" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.397879 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="copy" Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.397894 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="extract-content" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399031 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="extract-content" Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.399094 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="registry-server" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399105 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="registry-server" Dec 04 17:03:09 crc kubenswrapper[4676]: E1204 17:03:09.399134 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="gather" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399142 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="gather" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399504 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="copy" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399535 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d21b15-c63b-4b71-9c05-21ca7e59655a" containerName="keystone-cron" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399550 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a085244-1140-4f9c-8966-cf2ce3d1d074" containerName="registry-server" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.399566 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d769e1-ed6f-4192-bee8-d36d31249051" containerName="gather" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.401058 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.406892 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.579872 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k5qf\" (UniqueName: \"kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.579962 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.580250 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.682270 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k5qf\" (UniqueName: \"kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.682336 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.682475 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.683004 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.683456 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.714229 4676 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7k5qf\" (UniqueName: \"kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf\") pod \"certified-operators-vq6bk\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:09 crc kubenswrapper[4676]: I1204 17:03:09.720678 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:10 crc kubenswrapper[4676]: I1204 17:03:10.287898 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:10 crc kubenswrapper[4676]: I1204 17:03:10.663058 4676 generic.go:334] "Generic (PLEG): container finished" podID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerID="189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50" exitCode=0 Dec 04 17:03:10 crc kubenswrapper[4676]: I1204 17:03:10.663154 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerDied","Data":"189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50"} Dec 04 17:03:10 crc kubenswrapper[4676]: I1204 17:03:10.663538 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerStarted","Data":"2702cd56524032bc9d8733f459c9eefb795e3c5287b4088318b6d01a379ffc2f"} Dec 04 17:03:13 crc kubenswrapper[4676]: I1204 17:03:13.699051 4676 generic.go:334] "Generic (PLEG): container finished" podID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerID="d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72" exitCode=0 Dec 04 17:03:13 crc kubenswrapper[4676]: I1204 17:03:13.699113 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerDied","Data":"d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72"} Dec 04 17:03:14 crc kubenswrapper[4676]: I1204 17:03:14.713399 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerStarted","Data":"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd"} Dec 04 17:03:14 crc kubenswrapper[4676]: I1204 17:03:14.743549 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vq6bk" podStartSLOduration=2.066421918 podStartE2EDuration="5.743528073s" podCreationTimestamp="2025-12-04 17:03:09 +0000 UTC" firstStartedPulling="2025-12-04 17:03:10.664920541 +0000 UTC m=+6198.099590398" lastFinishedPulling="2025-12-04 17:03:14.342026666 +0000 UTC m=+6201.776696553" observedRunningTime="2025-12-04 17:03:14.734756053 +0000 UTC m=+6202.169425920" watchObservedRunningTime="2025-12-04 17:03:14.743528073 +0000 UTC m=+6202.178197940" Dec 04 17:03:19 crc kubenswrapper[4676]: I1204 17:03:19.721960 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:19 crc kubenswrapper[4676]: I1204 17:03:19.723056 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:19 crc kubenswrapper[4676]: I1204 17:03:19.777677 4676 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:20 crc kubenswrapper[4676]: I1204 17:03:20.718575 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:20 crc kubenswrapper[4676]: I1204 17:03:20.779088 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:22 crc kubenswrapper[4676]: I1204 17:03:22.633491 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vq6bk" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="registry-server" containerID="cri-o://787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd" gracePeriod=2 Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.099777 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.138822 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k5qf\" (UniqueName: \"kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf\") pod \"b500b72a-f74c-4625-9b65-12f01a560fe1\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.138982 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content\") pod \"b500b72a-f74c-4625-9b65-12f01a560fe1\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.139089 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities\") pod \"b500b72a-f74c-4625-9b65-12f01a560fe1\" (UID: \"b500b72a-f74c-4625-9b65-12f01a560fe1\") " Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.140503 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities" (OuterVolumeSpecName: "utilities") pod "b500b72a-f74c-4625-9b65-12f01a560fe1" (UID: "b500b72a-f74c-4625-9b65-12f01a560fe1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.148340 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf" (OuterVolumeSpecName: "kube-api-access-7k5qf") pod "b500b72a-f74c-4625-9b65-12f01a560fe1" (UID: "b500b72a-f74c-4625-9b65-12f01a560fe1"). InnerVolumeSpecName "kube-api-access-7k5qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.208431 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b500b72a-f74c-4625-9b65-12f01a560fe1" (UID: "b500b72a-f74c-4625-9b65-12f01a560fe1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.244454 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.244491 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b500b72a-f74c-4625-9b65-12f01a560fe1-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.244502 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k5qf\" (UniqueName: \"kubernetes.io/projected/b500b72a-f74c-4625-9b65-12f01a560fe1-kube-api-access-7k5qf\") on node \"crc\" DevicePath \"\"" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.650216 4676 generic.go:334] "Generic (PLEG): container finished" podID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerID="787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd" exitCode=0 Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.650283 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerDied","Data":"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd"} Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.650324 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vq6bk" event={"ID":"b500b72a-f74c-4625-9b65-12f01a560fe1","Type":"ContainerDied","Data":"2702cd56524032bc9d8733f459c9eefb795e3c5287b4088318b6d01a379ffc2f"} Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.650355 4676 scope.go:117] "RemoveContainer" containerID="787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.650550 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vq6bk" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.695345 4676 scope.go:117] "RemoveContainer" containerID="d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.699460 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.709070 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vq6bk"] Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.723797 4676 scope.go:117] "RemoveContainer" containerID="189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.789448 4676 scope.go:117] "RemoveContainer" containerID="787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd" Dec 04 17:03:23 crc kubenswrapper[4676]: E1204 17:03:23.790038 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd\": container with ID starting with 787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd not found: ID does not exist" containerID="787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.791499 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd"} err="failed to get container status \"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd\": rpc error: code = NotFound desc = could not find container \"787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd\": container with ID starting with 787067c567f099679f7c5928c33d4849564e29bc20c3fde9210d725164d51fbd not found: ID does not exist" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.791552 4676 scope.go:117] "RemoveContainer" containerID="d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72" Dec 04 17:03:23 crc kubenswrapper[4676]: E1204 17:03:23.791937 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72\": container with ID starting with d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72 not found: ID does not exist" containerID="d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.791969 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72"} err="failed to get container status \"d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72\": rpc error: code = NotFound desc = could not find container \"d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72\": container with ID starting with d96a347bdd60ce182ae3ad69a9caa513b36d6655c6da6bc7ec77191ad9fb4a72 not found: ID does not exist" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.791987 4676 scope.go:117] "RemoveContainer" containerID="189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50" Dec 04 17:03:23 crc kubenswrapper[4676]: E1204 17:03:23.792212 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50\": container with ID starting with 189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50 not found: ID does not exist" containerID="189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50" Dec 04 17:03:23 crc kubenswrapper[4676]: I1204 17:03:23.792238 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50"} err="failed to get container status \"189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50\": rpc error: code = NotFound desc = could not find container \"189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50\": container with ID starting with 189b86df9c3c1bcee3574e3bedca8547f37efbd9426d8b4f000c7aaa0451cb50 not found: ID does not exist" Dec 04 17:03:25 crc kubenswrapper[4676]: I1204 17:03:25.395484 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" path="/var/lib/kubelet/pods/b500b72a-f74c-4625-9b65-12f01a560fe1/volumes" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.415632 4676 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:18 crc kubenswrapper[4676]: E1204 17:04:18.417980 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="extract-utilities" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.418075 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="extract-utilities" Dec 04 17:04:18 crc kubenswrapper[4676]: E1204 17:04:18.418145 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="extract-content" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.418206 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="extract-content" Dec 04 17:04:18 crc kubenswrapper[4676]: E1204 17:04:18.418302 4676 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="registry-server" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.418363 4676 state_mem.go:107] "Deleted CPUSet assignment" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="registry-server" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.418653 4676 memory_manager.go:354] "RemoveStaleState removing state" podUID="b500b72a-f74c-4625-9b65-12f01a560fe1" containerName="registry-server" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.429872 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.430027 4676 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.612315 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqkm\" (UniqueName: \"kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.612637 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.612856 4676 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.715017 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.715145 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqkm\" (UniqueName: \"kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.715193 4676 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.715513 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.715539 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content\") pod \"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.737794 4676 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqkm\" (UniqueName: \"kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm\") pod 
\"redhat-marketplace-nzl8f\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:18 crc kubenswrapper[4676]: I1204 17:04:18.768750 4676 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:19 crc kubenswrapper[4676]: I1204 17:04:19.269374 4676 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:20 crc kubenswrapper[4676]: I1204 17:04:20.353714 4676 generic.go:334] "Generic (PLEG): container finished" podID="f59a4482-a7d4-46fa-88c7-19e5b99896da" containerID="7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17" exitCode=0 Dec 04 17:04:20 crc kubenswrapper[4676]: I1204 17:04:20.353927 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerDied","Data":"7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17"} Dec 04 17:04:20 crc kubenswrapper[4676]: I1204 17:04:20.354082 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerStarted","Data":"2caa3c2fac2400fd9001eeb3fa77b2a5ce07663cdf2443e200febefc0c5daa7c"} Dec 04 17:04:21 crc kubenswrapper[4676]: I1204 17:04:21.366386 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerStarted","Data":"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5"} Dec 04 17:04:22 crc kubenswrapper[4676]: I1204 17:04:22.383764 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerDied","Data":"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5"} Dec 04 17:04:22 crc kubenswrapper[4676]: I1204 17:04:22.383721 4676 generic.go:334] "Generic (PLEG): container finished" podID="f59a4482-a7d4-46fa-88c7-19e5b99896da" containerID="e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5" exitCode=0 Dec 04 17:04:23 crc kubenswrapper[4676]: I1204 17:04:23.407678 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerStarted","Data":"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890"} Dec 04 17:04:23 crc kubenswrapper[4676]: I1204 17:04:23.439604 4676 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nzl8f" podStartSLOduration=2.968384358 podStartE2EDuration="5.43948721s" podCreationTimestamp="2025-12-04 17:04:18 +0000 UTC" firstStartedPulling="2025-12-04 17:04:20.356474622 +0000 UTC m=+6267.791144469" lastFinishedPulling="2025-12-04 17:04:22.827577414 +0000 UTC m=+6270.262247321" observedRunningTime="2025-12-04 17:04:23.430050491 +0000 UTC m=+6270.864720348" watchObservedRunningTime="2025-12-04 17:04:23.43948721 +0000 UTC m=+6270.874157067" Dec 04 17:04:28 crc kubenswrapper[4676]: I1204 17:04:28.769300 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:28 crc kubenswrapper[4676]: I1204 17:04:28.769864 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:28 crc kubenswrapper[4676]: I1204 17:04:28.816299 4676 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:29 crc kubenswrapper[4676]: I1204 17:04:29.539956 4676 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:29 crc kubenswrapper[4676]: I1204 17:04:29.588233 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:31 crc kubenswrapper[4676]: I1204 17:04:31.527116 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nzl8f" podUID="f59a4482-a7d4-46fa-88c7-19e5b99896da" containerName="registry-server" containerID="cri-o://62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890" gracePeriod=2 Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.031484 4676 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.055050 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities\") pod \"f59a4482-a7d4-46fa-88c7-19e5b99896da\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.055101 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqkm\" (UniqueName: \"kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm\") pod \"f59a4482-a7d4-46fa-88c7-19e5b99896da\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.055180 4676 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content\") pod \"f59a4482-a7d4-46fa-88c7-19e5b99896da\" (UID: \"f59a4482-a7d4-46fa-88c7-19e5b99896da\") " Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.059661 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities" (OuterVolumeSpecName: "utilities") pod "f59a4482-a7d4-46fa-88c7-19e5b99896da" (UID: "f59a4482-a7d4-46fa-88c7-19e5b99896da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.068295 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm" (OuterVolumeSpecName: "kube-api-access-2sqkm") pod "f59a4482-a7d4-46fa-88c7-19e5b99896da" (UID: "f59a4482-a7d4-46fa-88c7-19e5b99896da"). InnerVolumeSpecName "kube-api-access-2sqkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.094090 4676 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f59a4482-a7d4-46fa-88c7-19e5b99896da" (UID: "f59a4482-a7d4-46fa-88c7-19e5b99896da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.157490 4676 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-utilities\") on node \"crc\" DevicePath \"\"" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.157532 4676 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqkm\" (UniqueName: \"kubernetes.io/projected/f59a4482-a7d4-46fa-88c7-19e5b99896da-kube-api-access-2sqkm\") on node \"crc\" DevicePath \"\"" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.157543 4676 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f59a4482-a7d4-46fa-88c7-19e5b99896da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.541021 4676 generic.go:334] "Generic (PLEG): container finished" podID="f59a4482-a7d4-46fa-88c7-19e5b99896da" containerID="62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890" exitCode=0 Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.541069 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerDied","Data":"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890"} Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.541100 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nzl8f" event={"ID":"f59a4482-a7d4-46fa-88c7-19e5b99896da","Type":"ContainerDied","Data":"2caa3c2fac2400fd9001eeb3fa77b2a5ce07663cdf2443e200febefc0c5daa7c"} Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.541120 4676 scope.go:117] "RemoveContainer" containerID="62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.541120 4676 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nzl8f" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.565998 4676 scope.go:117] "RemoveContainer" containerID="e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.588101 4676 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.607151 4676 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nzl8f"] Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.615056 4676 scope.go:117] "RemoveContainer" containerID="7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.659032 4676 scope.go:117] "RemoveContainer" containerID="62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890" Dec 04 17:04:32 crc kubenswrapper[4676]: E1204 17:04:32.660213 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890\": container with ID starting with 62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890 not found: ID does not exist" containerID="62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.660258 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890"} err="failed to get container status \"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890\": rpc error: code = NotFound desc = could not find container \"62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890\": container with ID starting with 62e44ba1ff6cd21e00917c86b0f4bae5fbe64f33101511fa2dac14504f767890 not found: ID does not exist" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.660313 4676 scope.go:117] "RemoveContainer" containerID="e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5" Dec 04 17:04:32 crc kubenswrapper[4676]: E1204 17:04:32.660736 4676 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5\": container with ID starting with e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5 not found: ID does not exist" containerID="e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.660771 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5"} err="failed to get container status \"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5\": rpc error: code = NotFound desc = could not find container \"e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5\": container with ID starting with e1a8e6cdaf9b5fa947131f111078fb756309872a579248f4a01b1cbde803b5c5 not found: ID does not exist" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.660801 4676 scope.go:117] "RemoveContainer" containerID="7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17" Dec 04 17:04:32 crc kubenswrapper[4676]: E1204 17:04:32.662394 4676 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17\": container with ID starting with 7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17 not found: ID does not exist" containerID="7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17" Dec 04 17:04:32 crc kubenswrapper[4676]: I1204 17:04:32.662424 4676 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17"} err="failed to get container status \"7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17\": rpc error: code = NotFound desc = could not find container \"7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17\": container with ID starting with 7437d079d817798a943bbf448bdf5f1a1a0e7409eb559451ede9e9172ff98a17 not found: ID does not exist" Dec 04 17:04:33 crc kubenswrapper[4676]: I1204 17:04:33.405422 4676 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f59a4482-a7d4-46fa-88c7-19e5b99896da" path="/var/lib/kubelet/pods/f59a4482-a7d4-46fa-88c7-19e5b99896da/volumes" Dec 04 17:04:46 crc kubenswrapper[4676]: I1204 17:04:46.026529 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:04:46 crc kubenswrapper[4676]: I1204 17:04:46.029663 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:05:16 crc kubenswrapper[4676]: I1204 17:05:16.027415 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:05:16 crc kubenswrapper[4676]: I1204 17:05:16.028150 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.026718 4676 patch_prober.go:28] interesting pod/machine-config-daemon-5s6p9 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.027291 4676 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.027338 4676 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.028332 4676 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780"} pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.028392 4676 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerName="machine-config-daemon" containerID="cri-o://4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" gracePeriod=600 Dec 04 17:05:46 crc kubenswrapper[4676]: E1204 17:05:46.151834 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.487270 4676 generic.go:334] "Generic (PLEG): container finished" podID="b3eca9b5-0269-40ad-8bc1-142e702d9454" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" exitCode=0 Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.487341 4676 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" event={"ID":"b3eca9b5-0269-40ad-8bc1-142e702d9454","Type":"ContainerDied","Data":"4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780"} Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.488012 4676 scope.go:117] "RemoveContainer" containerID="c3dbdc80a9ba931ebb1a3965991f32c779f274f9df67d1cdb31508db343b2eb5" Dec 04 17:05:46 crc kubenswrapper[4676]: I1204 17:05:46.489010 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:05:46 crc kubenswrapper[4676]: E1204 17:05:46.489517 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:05:59 crc kubenswrapper[4676]: I1204 17:05:59.384782 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:05:59 crc kubenswrapper[4676]: E1204 17:05:59.385531 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:06:12 crc 
kubenswrapper[4676]: I1204 17:06:12.384501 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:06:12 crc kubenswrapper[4676]: E1204 17:06:12.385349 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:06:24 crc kubenswrapper[4676]: I1204 17:06:24.385917 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:06:24 crc kubenswrapper[4676]: E1204 17:06:24.386609 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:06:38 crc kubenswrapper[4676]: I1204 17:06:38.385425 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:06:38 crc kubenswrapper[4676]: E1204 17:06:38.386244 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:06:51 crc kubenswrapper[4676]: I1204 17:06:51.390579 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:06:51 crc kubenswrapper[4676]: E1204 17:06:51.391789 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454" Dec 04 17:07:06 crc kubenswrapper[4676]: I1204 17:07:06.384997 4676 scope.go:117] "RemoveContainer" containerID="4bd87cb2b81e230c8f59f8a937ae222dabba256b17ba2b16bece6a743aba2780" Dec 04 17:07:06 crc kubenswrapper[4676]: E1204 17:07:06.385921 4676 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5s6p9_openshift-machine-config-operator(b3eca9b5-0269-40ad-8bc1-142e702d9454)\"" pod="openshift-machine-config-operator/machine-config-daemon-5s6p9" podUID="b3eca9b5-0269-40ad-8bc1-142e702d9454"